BIDS Faculty Affiliates Douglas Guilbeault and David Holtz, both assistant professors in the Berkeley Haas School of Business, have each been awarded a 2021-2022 grant from the Haas Center for Equity, Gender, and Leadership (EGAL) for research on topics related to diversity, equity, and inclusion (view the full list of this year's grantees).
Guilbeault’s project – Identifying Stereotypes in Online Images and Their Implications for Algorithmic Bias – examines online gender stereotypes in image and text sources from Google and Wikipedia. Guilbeault is developing a novel method for measuring stereotypes in publicly searchable image datasets and for comparing these stereotypes with those that arise in online texts from the same sources. Early results show that gender stereotypes are both more frequent and more salient in image sources than in corresponding texts and articles, and, more concerningly, that stereotypes in images may be even more influential than stereotypes in text in priming biases in people’s beliefs about social representation across categories.
This study raises a challenging ethical question for the design of public algorithms and data repositories. Companies often market their algorithms as “knowledge engines,” or ways of accessing accurate information about the world; in practice, however, digital platforms too often rely on biased algorithms driven by market incentives. So what is a truly “neutral” image? Guilbeault’s early results point to the need for fairness and neutrality in the representation of content, particularly in how people of different demographic traits and social categories are portrayed. This project will support efforts to identify and mitigate social biases in machine learning applications trained on public images, “social biases that are strikingly costly for companies and society at large.”
Holtz’s project – Cracking the Coding Interview – aims to deepen our understanding of the factors that drive race and gender disparities in who enters STEM professions, with a specific focus on bias in hiring processes. The study sets out to identify the micro-interactional behavioral routines that are associated with higher evaluations in technical job interviews for software engineering roles. Holtz and his collaborators believe that identifying these behavioral routines will deepen our understanding of the interactional dynamics that lead white male job seekers to gain disproportionate representation in STEM professions. The project will analyze audio, text, and code data from over 20,000 technical phone screens to characterize the behavioral dynamics that unfold during these interviews and to understand the extent to which they might contribute to gender and race disparities in hiring for STEM roles; it also received $2,500 in research funding from the Berkeley Culture Initiative. The project is a collaboration between Holtz and Sanaz Mobasseri (BU Questrom), Janet Xu (Harvard), and Zanele Munyikwa (MIT Sloan).