1. Gender stereotypes and voice assistants: do users' gender and conversation topic matter?
- Author
- Dogruel, Leyla and Joeckel, Sven
- Subjects
- *AUTOMATIC speech recognition, *COMMUNITY support, *GENDER role, *STEREOTYPES, *CONVERSATION, *GENDER identity, *PROMPTS (Psychology), *CRONBACH'S alpha, *QUESTIONNAIRES, *STATISTICAL sampling, *UNDERGRADUATES, *SOCIAL theory, *JUDGMENT sampling, *DESCRIPTIVE statistics, *RANDOMIZED controlled trials, *ANALYSIS of covariance, *MULTIVARIATE analysis, *EXPERIMENTAL design, *IMPLICIT bias, *ANALYSIS of variance, *RESEARCH methodology, *USER interfaces
- Abstract
Voice assistants (VAs), such as Alexa, Siri, or Google Now, are becoming increasingly ubiquitous. Consequently, the potential societal impacts of such systems are gaining relevance in public and academic discourses. Investigating the effects of VA voice features on users' perceptions of VAs is one central aspect of this research. Here, the potential effects of VAs on the perception of gender roles and the attribution of gender stereotypes stand out. Following the Computers as Social Actors paradigm, it is assumed that gendered voices have the potential to reproduce existing gender stereotypes. Recently, however, the implementation of gender-ambiguous voices has been suggested as a way to mitigate such effects and promote more diverse technology design. In this study, we investigate the relationship between gendered VA voice (male, female, gender-ambiguous) and gender stereotype activation. In two online surveys with an experimental manipulation (Study 1: N = 140, Study 2: N = 397), we test stereotype activation as a function of VA voice, user gender, and conversation topic. Empirical findings are mixed, with no effects of conversation topic but indications of small effects of VA voice. Implications for future research are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2024