Bias toward the Accents of Virtual Assistants
Abstract
Language bias, both positive and negative, is a well-documented phenomenon among human interlocutors. We examine whether this bias extends to virtual assistants with various accents, specifically Apple's Siri and Google Assistant. We conducted three studies with different stimuli and designs to investigate U.S. English speakers' attitudes toward Google's British, Indian, and American voices and Apple's Irish, Indian, South African, British, Australian, and American voices. Analysis reveals consistently lower fluency ratings for the Irish, Indian, and South African voices (compared with the American ones) but no consistent evidence of bias related to competence, warmth, or willingness to interact. Moreover, participants often misidentified the voices' countries of origin but correctly identified the voices as artificial. We conclude that this overall lack of bias may be due to two possibilities: the limited humanlikeness of the voices and the unavailability of nonstandardized voices and of voices from countries toward which U.S. listeners typically show bias.
Recommended Citation
Hercula, Sarah, Daniel Shank, Jessica Cundiff, and David Wright. "Bias toward the Accents of Virtual Assistants." Journal of Language and Social Psychology, SAGE Publications, 2024.
The definitive version is available at https://doi.org/10.1177/0261927X241291611
Department(s)
English and Technical Communication
Second Department
Psychological Science
Keywords and Phrases
Google Assistant; language attitudes; language bias; Siri; virtual assistants
International Standard Serial Number (ISSN)
0261-927X (print); 1552-6526 (online)
Document Type
Article - Journal
Document Version
Citation
File Type
text
Language(s)
English
Rights
© 2024 SAGE Publications, All rights reserved.
Publication Date
01 Jan 2024
Comments
Missouri University of Science and Technology