- cross-posted to:
- technology@lemmy.world
At least 80 million (3.3%) of Wikipedia’s facts are inconsistent; LLMs may help find them
A paper titled “Detecting Corpus-Level Knowledge Inconsistencies in Wikipedia with Large Language Models”,[[1]](https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2025-12-01/Recent_research#cite_note-1) presented earlier this month at the EMNLP conference, examines…
No, but it can make things a lot worse.
Hey, I just quoted much the same thing where this was also crossposted, haha.
…said conditioned Lemmy user reflexively upon seeing mention of AI in the title.
Which is about as logical as cringing whenever someone mentioned blockchain features a while back.
AI is not currently in a state (technical or social) that makes it useful.
“AI” covers a lot more than LLMs, and much of it is quite useful. It figured out protein folding, for example.
Even in this narrow case of LLMs, it’s still correctly pointing out flaws in Wikipedia articles.