Abstract

The hiring process is crucial for organizational success but has long been troubled by human biases. Many organizations now include AI in their hiring protocols to mitigate these biases and increase efficiency. However, AI itself can have biases baked in. Human biases and AI biases are distinct but related; here, we examine how human and AI biases interact to affect hiring outcomes. Through an online experiment, we examine this question in the context of gendered hiring for a male-dominated leadership position in electrical engineering. The study tests how elevated and depressed AI recommendations for male and female job candidates affect participant evaluations of those candidates, moderated by participants' attitudes about gender. Findings show that, all else constant, elevated AI recommendations increased participants' evaluations of candidates on both competence and likeability, while depressed AI recommendations decreased participants' ratings on both dimensions. However, the benefits of AI recommendations were not distributed evenly. High AI scores benefited male candidates more than female candidates. Ratings were also affected by participants' gender attitudes, revealing effects of sexism on hiring decisions even when AI is involved. These preliminary findings offer insight into the intersection of human and AI biases as they influence hiring outcomes.

Department(s)

Psychological Science

Second Department

Engineering Management and Systems Engineering

Comments

Office of the Vice Chancellor for Research and Innovation, Grant None

Keywords and Phrases

algorithmic bias; decision aid; gender bias; human judgment; human-AI interaction

Document Type

Article - Conference proceedings

Document Version

Citation

File Type

text

Language(s)

English

Rights

© 2025 Institute of Electrical and Electronics Engineers, All rights reserved.

Publication Date

01 Jan 2025
