This talk is part of a panel session titled “Demystifying Algorithmic Processes: What is the role of algorithms in online platforms, what can they do and not do, and how should they be governed?”
Summary: Lawrence Lessig argued that "code is law" (1999) because code has the force of law, putting programmers in positions where they must make governmental decisions. I use the case of Wikipedia to argue that code can also be law in the sense of a deliberated text, something that looks more like public policy, which Wikipedians have been debating for over ten years. Rather than a clear line between those who "program or be programmed," I have seen a broader spectrum of participation in the spaces where Wikipedia's algorithmic infrastructure is debated. People without programming expertise (much less ML/AI expertise) can and do participate in discussions about what kinds of automation they think should take place on Wikipedia. However, this requires more than a generic commitment to openness or the assumption that openness is achieved once everything is released in a public source code repository. As with public policy, it requires specific kinds of processes, discourses, and structures of accountability and translation work, which come with their own expertises and barriers to participation.
Speaker(s)
R. Stuart Geiger
Former BIDS Ethnographer Stuart Geiger is now a faculty member at the University of California, San Diego, jointly appointed in the Department of Communication and the Halıcıoğlu Data Science Institute. At BIDS, as an ethnographer of science and technology, he studied the infrastructures and institutions that support the production of knowledge. He launched the Best Practices in Data Science discussion group in 2019, having been one of the original members of the MSDSE Data Science Studies Working Group. Previously, his work on Wikipedia focused on the community of volunteer editors who produce and maintain an open encyclopedia. He also studied distributed scientific research networks and projects, including the Long-Term Ecological Research Network and the Open Science Grid. In Wikipedia and scientific research, he studied topics including newcomer socialization, community governance, specialization and professionalization, quality control and verification, cooperation and conflict, the roles of support staff and technicians, and diversity and inclusion. And, as these communities are made possible through software systems, he studied how the design of software tools and systems intersects with all of these issues. He received his undergraduate degree from UT Austin and an MA in Communication, Culture, and Technology from Georgetown University, where he began empirically studying communities using qualitative and ethnographic methods. While earning his PhD at the UC Berkeley School of Information, he worked with anthropologists, sociologists, psychologists, historians, organizational and management scholars, designers, and computer scientists.