Ethnography is traditionally a qualitative and inductive methodology, now widely used to holistically investigate people's lived experiences in and across cultures.
The collection, curation, and analysis of data have always been as social as they are technical. As the statistical techniques and computational infrastructures of artificial intelligence and data science rapidly develop, we must continue to ground our understandings of data in context, drawing on the lived experiences of the people who give that data meaning. But how do we bring human-centered perspectives and cultural contexts to data-intensive, highly automated algorithmic decision-making? In this talk, I define and discuss two ways of thinking about ethnographic methods in relation to computer, information, and data science, then discuss how my research into various knowledge infrastructures and user-generated content platforms relates to both.
First, the ethnography of computation involves using traditional ethnographic methods (e.g., interviews, observation, participant-observation, case studies, and archival research) to study how people relate to computation and data in various ways. How do people design, develop, deploy, document, debate, maintain, manage, use, not use, learn, or teach computation and data in their everyday life and work? Second, computational ethnography involves extending ethnography's traditionally qualitative methodological toolkit with computational methods. How can we conduct mixed-method scholarship in line with the broader epistemological principles that make ethnography a rich method for holistically investigating cultural phenomena? Both approaches bring key insights and collaborations to many classic and contemporary issues concerning information systems as socio-technical systems, letting us attend to data, information, and knowledge as they exist in particular organizational, institutional, social, cultural, economic, and political contexts.
R. Stuart Geiger
Former BIDS Ethnographer Stuart Geiger is now a faculty member at the University of California, San Diego, jointly appointed in the Department of Communication and the Halıcıoğlu Data Science Institute. At BIDS, as an ethnographer of science and technology, he studied the infrastructures and institutions that support the production of knowledge. He launched the Best Practices in Data Science discussion group in 2019, having been one of the original members of the MSDSE Data Science Studies Working Group. Previously, his work on Wikipedia focused on the community of volunteer editors who produce and maintain an open encyclopedia. He also studied distributed scientific research networks and projects, including the Long-Term Ecological Research Network and the Open Science Grid. In Wikipedia and scientific research, he studied topics including newcomer socialization, community governance, specialization and professionalization, quality control and verification, cooperation and conflict, the roles of support staff and technicians, and diversity and inclusion. And, as these communities are made possible through software systems, he studied how the design of software tools and systems intersects with all of these issues. He received his undergraduate degree from UT Austin and an MA in Communication, Culture, and Technology from Georgetown University, where he began empirically studying communities using qualitative and ethnographic methods. While completing his PhD at the UC Berkeley School of Information, he worked with anthropologists, sociologists, psychologists, historians, organizational and management scholars, designers, and computer scientists.