
Wikipedia is using artificial intelligence to attract more contributors

Although a handful of restrictions have been put in place over the years to reduce online vandalism, people are still more or less free to create and modify Wikipedia pages however they please. This openness is one of Wikipedia’s defining characteristics and greatest strengths, but it also opens the door to misinformation and vandalism, which remain a massive problem for the site even with a small army of dedicated editors on patrol. To make matters worse, Wikipedia’s efforts to crack down on bad edits have caused it to lose a significant chunk of its active contributors, since a simple mistake in an article can see hours of work and writing swiftly reverted. To remedy this, Wikipedia has decided to use artificial intelligence to make itself more contributor-friendly and efficient.

Software trained to know the difference between an honest mistake and intentional vandalism is being rolled out in an effort to make editing Wikipedia less psychologically bruising. It was developed by the Wikimedia Foundation, the nonprofit organization that supports Wikipedia.

One motivation for the project is a significant decline in the number of people considered active contributors to the flagship English-language Wikipedia: their ranks have fallen by 40 percent over the past eight years, to about 30,000. Research indicates that the problem is rooted in Wikipedians’ complex bureaucracy and their often hard-line responses to newcomers’ mistakes, enabled by semi-automated tools that make deleting new changes easy.

Aaron Halfaker, a senior research scientist at the Wikimedia Foundation who helped diagnose that problem, is now leading a project to fight it with algorithms that have a sense for human fallibility. His ORES system, short for “Objective Revision Evaluation Service,” can be trained to score the quality of new changes to Wikipedia and to judge whether an edit was made in good faith. Halfaker invented ORES in hopes of improving the tools that help Wikipedia editors by surfacing recent edits and letting them be undone with a single click. Those tools were created to meet a genuine need for better quality control after Wikipedia became popular, but an unintended consequence is that new editors can find their first contributions wiped out without explanation because they unwittingly broke one of Wikipedia’s many rules.
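For readers curious how a service like this might be queried, here is a minimal sketch in Python. It assumes the public ORES REST endpoint at https://ores.wikimedia.org and its “damaging” and “goodfaith” models; the endpoint path, model names, revision ID, and response layout shown here are illustrative assumptions rather than details taken from the article.

import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Illustrative sketch: ask ORES to score one English-Wikipedia revision with
# the assumed "damaging" and "goodfaith" models. The revision ID is a placeholder.
REVISION_ID = 123456
params = urlencode({"models": "damaging|goodfaith", "revids": REVISION_ID})
url = f"https://ores.wikimedia.org/v3/scores/enwiki/?{params}"

with urlopen(url) as response:
    data = json.load(response)

# Assumed response layout: scores nested by wiki, revision, and model,
# each carrying a prediction and class probabilities.
scores = data["enwiki"]["scores"][str(REVISION_ID)]
damaging_prob = scores["damaging"]["score"]["probability"]["true"]
goodfaith_prob = scores["goodfaith"]["score"]["probability"]["true"]

print(f"Chance the edit is damaging: {damaging_prob:.2f}")
print(f"Chance the edit was made in good faith: {goodfaith_prob:.2f}")

A tool built on scores like these could, for example, flag only edits that look both damaging and made in bad faith, letting editors focus their attention on genuine vandalism rather than honest slips.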



Written by Lorie Wimble


