This talk is based on a multi-year ethnographic study of algorithmic software agents in Wikipedia, where bots and automated tools have fundamentally transformed the nature of the notoriously decentralized, ‘anyone can edit’ encyclopedia project. I studied how the development and operation of automated software agents intersected with the development of organizational structures and epistemic norms. My ethnography of infrastructure (Star, 1999) involved participant-observation in various spaces of Wikipedia. I discuss how algorithmic systems are deployed to enforce particular behavioral and epistemological standards in Wikipedia, and how their deployment can become a site for collective sensemaking among veteran Wikipedians.
Recent research has revealed that political actors are using algorithms and automation in efforts to sway public opinion. In some circumstances, the ways coded automation interacts with or affects human users are unforeseeable, even by the software engineers who write such algorithms. In others, individuals and organizations work to build software that purposefully targets voters, activists, and political opponents. Politicized social bots are one version of potentially malicious automated programs; discriminatory algorithms are another. Understanding how technologies like these are used to spread propaganda, engage with citizens, and influence political outcomes is a pressing problem for scholars of communication.
This preconference sought to address these problems and explore a broad range of interdisciplinary questions related to algorithms, automation, and politics. What impact have automated scripts on global social media services had, to date, on political discussions and current affairs? Who produces these scripts, and under what conditions do innovations in computer science and engineering get repurposed for political ends? Is there a demonstrable impact of algorithms and bots on news consumption? What is the evolutionary trajectory of this field of computer science, and what are the mechanisms for improving public literacy, generating careful policy oversight, and preventing the abuse of social networking technologies? It will be important to work with concrete case studies and examples of such manipulation, and it will be critical to draw theory from both political communication and science and technology studies to explain these cases.
Speaker(s)
R. Stuart Geiger
Former BIDS Ethnographer Stuart Geiger is now a faculty member at the University of California, San Diego, jointly appointed in the Department of Communication and the Halıcıoğlu Data Science Institute. At BIDS, as an ethnographer of science and technology, he studied the infrastructures and institutions that support the production of knowledge. He launched the Best Practices in Data Science discussion group in 2019, having been one of the original members of the MSDSE Data Science Studies Working Group. Previously, his work on Wikipedia focused on the community of volunteer editors who produce and maintain an open encyclopedia. He also studied distributed scientific research networks and projects, including the Long-Term Ecological Research Network and the Open Science Grid. In Wikipedia and scientific research, he studied topics including newcomer socialization, community governance, specialization and professionalization, quality control and verification, cooperation and conflict, the roles of support staff and technicians, and diversity and inclusion. Because these communities are made possible through software systems, he also studied how the design of software tools and systems intersects with all of these issues. He received an undergraduate degree from UT Austin and an MA in Communication, Culture, and Technology from Georgetown University, where he began empirically studying communities using qualitative and ethnographic methods. While completing his PhD at the UC Berkeley School of Information, he worked with anthropologists, sociologists, psychologists, historians, organizational and management scholars, designers, and computer scientists.