Wikipedia editors adopt stronger rules against AI-generated content
Wikipedia editors have strengthened the site’s rules against AI-generated content.
The new guideline, which was overwhelmingly supported in a discussion involving more than 40 editors, says content generated by large language models cannot be used in new or existing articles. The previous guideline, which many editors had viewed as a placeholder, only prohibited using AI to generate new articles from scratch.
The new guideline also clarifies when it is okay for editors to use LLMs. It says editors can use them “to suggest refinements to their own writing, and to incorporate some of them after human review, provided the LLM doesn’t introduce content of its own.”
Ilyas Lebleu, a Wikipedia editor who proposed the new guideline, said it addresses the growing problems the site is having with humans and AI agents using LLMs to create content.
In early March, a suspected bot named TomWikiAssist authored several articles and edited other pages, according to an account in The Wikipedian, a Substack newsletter.
“An AI agent can just run wild 24 hours per day,” Lebleu said. “It can cause disruption at a scale that is much larger than what a human editor can achieve, even with the help of LLMs.”
Hannah Clover, an editor who was Wikimedian of the Year in 2024, praised the new guideline because it is simple and clear. “LLM text has been really frowned upon for a while, but it’s good to have that officially be the case,” Clover said.
Barkeep49, an influential editor who wrote an early essay predicting Wikipedia’s challenges with AI, also praised the rule.
“It’s important to Wikipedians that our content be written by humans,” said Barkeep49, who, like many editors, prefers not to disclose his real name. “This new guideline, while certainly being more prohibitive than the previous guideline, still recognizes that large language models have utility. My greater hope is that we can get our readers to understand the advantages of a human- rather than AI-written encyclopedia.”
But David Lovett, a veteran editor who writes about Wikipedia in his Edit History Substack, said there’s more work to do. “The less AI-generated content the better. The Internet is already awash with slop. Wikipedia should do everything it can to stay clean. I like the spirit of this rule but I don’t think it’s strict enough.”
The post Wikipedia editors adopt stronger rules against AI-generated content appeared first on Reporters’ Lab.