Wikipedia has become a dominant source of reference information for more than half a billion people every month. Through its improbable rise to popularity, this "free encyclopedia that anyone can edit" has become a synecdoche for open production communities online. In order to operate at massive scale (~160k edits per day), Wikipedians have embraced algorithmic technologies that bring efficiency and consistency to the wiki's complex, distributed processes. These algorithms mediate social processes, governance decisions, and editors' perceptions of each other. Specifically, so-called "black box" artificial intelligences have proven invaluable for supporting curation activities at scale, but they also have the potential to silence voices and introduce ideologically founded biases in insidious ways. Despite Wikipedians' open, auditable processes, that's exactly what has been happening. In this talk, I'll introduce "ORES," an open AI platform designed to enable Wikipedia's technologists to enact alternative ideological visions and to enable researchers to easily perform audits. I'll share some lessons that we've learned maintaining a large-scale, generalized AI service and discuss a call to action directed toward critical algorithms researchers to take advantage of this platform for their studies.
Dr. Halfaker is a principal research scientist at the Wikimedia Foundation and a senior scientist at the University of Minnesota. He studies the intersection of advanced algorithmic technologies and social issues in open production communities (like Wikipedia) using a mixture of experimental engineering, data science, and ethnographic methods. His studies of Wikipedia's editor decline and his development of "ORES," an open AI platform, have received substantial attention from the technology press. http://halfaker.info