When AI encyclopedias copy and change Wikipedia

global (English Wikipedia content) · Sat May 16 2026
A new AI tool called Grokipedia arrived with big claims. It promised to fix Wikipedia’s problems by being more honest and less biased. But does it actually work? To find out, researchers compared 17,790 articles from both sites. They picked the most edited Wikipedia pages and checked Grokipedia’s versions. The results were mixed. Grokipedia articles were longer and used more complex sentences. They also cited fewer sources per word of text. When researchers measured how similar the paired articles were, they found something surprising. Some articles stayed almost the same. Others changed completely. The biggest differences showed up in topics like religion and history. Grokipedia also leaned more to the right when choosing which news sites to cite.
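The article does not say which similarity metric the researchers used. As a rough, hypothetical illustration of the idea, a word-level ratio like Python's built-in difflib can separate near-duplicate article pairs from heavily rewritten ones:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Word-level similarity in [0, 1]; 1.0 means identical texts.

    This is only an illustrative stand-in -- the study's actual
    metric is not described in the article.
    """
    return SequenceMatcher(None, a.split(), b.split()).ratio()

# Hypothetical article snippets, not real Wikipedia/Grokipedia text.
wiki = "The cat sat on the mat and looked around."
grok = "The cat sat on the mat, then looked around slowly."
print(f"similarity: {similarity(wiki, grok):.2f}")
```

A score near 1.0 would suggest an article copied almost verbatim, while a low score flags pages that were substantially rewritten, which is the kind of split the researchers reportedly found.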
This doesn’t mean AI is useless. But it raises questions. Does longer text mean better information? Why do some articles stay the same while others don’t? The way Grokipedia picks sources might be shaping what readers believe. If AI tools just expand stories without strong proof, can we trust them? The bigger issue isn’t just about this one tool. It’s about how we handle knowledge made by machines. Who controls it? How do we know it’s real? Wikipedia has rules and sources. AI tools might skip those steps to make things more interesting. That could be a problem.
https://localnews.ai/article/when-ai-encyclopedias-copy-and-change-wikipedia-48ced55c
