
Design Approach May Help Fix Bias in Artificial Intelligence


For Immediate Release

Bias in artificial intelligence (AI) and machine learning programs is well established. Researchers from North Carolina State University and Pennsylvania State University are now proposing that software developers incorporate the concept of “feminist design thinking” into their development process as a way of improving equity – particularly in the development of software used in the hiring process.

“There seem to be countless stories of ways that bias in AI is manifesting itself, and there are many thought pieces out there on what contributes to this bias,” says Fay Payton, a professor of information systems/technology and University Faculty Scholar at NC State. “Our goal here was to put forward guidelines that can be used to develop workable solutions to algorithmic bias against women, African American and Latinx professionals in the IT workforce.

“Too many existing hiring algorithms incorporate de facto identity markers that exclude qualified candidates because of their gender, race, ethnicity, age and so on,” says Payton, who is co-lead author of a paper on the work. “We are simply looking for equity – that job candidates be able to participate in the hiring process on an equal footing.”

Payton and her collaborators argue that an approach called feminist design thinking could serve as a valuable framework for developing software that reduces algorithmic bias in a meaningful way. In this context, the application of feminist design thinking would mean incorporating the idea of equity into the design of the algorithm itself.
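One hypothetical illustration of what building an equity check into a hiring pipeline can look like in practice is auditing a screening algorithm’s selection rates across demographic groups. The sketch below is not drawn from the paper; the EEOC “four-fifths rule” heuristic, the group labels and the data are illustrative assumptions only.

```python
# Hypothetical sketch (not from the paper): flag adverse impact in an
# algorithmic hiring screen using the EEOC four-fifths rule, which flags
# any group whose selection rate falls below 80% of the highest group's.
from collections import defaultdict

def selection_rates(candidates):
    """Compute per-group selection rates from (group, selected) pairs."""
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in candidates:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def four_fifths_flags(rates, threshold=0.8):
    """Return groups selected at less than `threshold` times the top rate."""
    top = max(rates.values())
    return [g for g, r in rates.items() if r < threshold * top]

# Invented toy data: a screen passing 60% of group A but 30% of group B.
screened = ([("A", True)] * 6 + [("A", False)] * 4
            + [("B", True)] * 3 + [("B", False)] * 7)
rates = selection_rates(screened)
print(rates)                     # {'A': 0.6, 'B': 0.3}
print(four_fifths_flags(rates))  # ['B'] -- 0.3 < 0.8 * 0.6
```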

“Compounding the effects of algorithmic bias is the historical underrepresentation of women, Black and Latinx software engineers, who could provide novel insights into equitable design approaches based on their lived experiences,” says Lynette Yarger, co-lead author of the paper and an associate professor of information sciences and technology at Penn State.

“Essentially, this approach would mean developing algorithms that value inclusion and equity across gender, race and ethnicity,” Payton says. “The practical application of this is the development and implementation of a process for creating algorithms in which designers are considering an audience that includes women, that includes Black people, that includes Latinx people. In short, developers of all backgrounds would be called on to actively consider – and value – people who are different from themselves.

“To be clear, this is not just about doing something because it is morally correct. But we know that women, African Americans and Latinx people are underrepresented in IT fields. And there is ample evidence that a diverse, inclusive workforce improves a company’s bottom line,” Payton says. “If you can do the right thing and improve your profit margin, why wouldn’t you?”

The paper, “Algorithmic equity in the hiring of underrepresented IT job candidates,” is published in the journal Online Information Review. The paper was co-authored by Bikalpa Neupane of Penn State.

-shipman-

Note to Editors: The study abstract follows.

“Algorithmic equity in the hiring of underrepresented IT job candidates”

Authors: Lynette Yarger and Bikalpa Neupane, Pennsylvania State University; Fay Cobb Payton, North Carolina State University

Published: Dec. 17, 2019, Online Information Review

DOI: 10.1108/OIR-10-2018-0334

Abstract:
Purpose: The purpose of this paper is to offer a critical analysis of talent acquisition software and its potential for fostering equity in the hiring process for underrepresented IT professionals. The under-representation of women, African-American and Latinx professionals in the IT workforce is a longstanding issue that contributes to and is impacted by algorithmic bias.

Design/methodology/approach: Sources of algorithmic bias in talent acquisition software are presented. Feminist design thinking is presented as a theoretical lens for mitigating algorithmic bias.

Findings: Data are just one tool for recruiters to use; human expertise is still necessary. Even well-intentioned algorithms are not neutral and should be audited for morally and legally unacceptable decisions. Feminist design thinking provides a theoretical framework for considering equity in the hiring decisions made by talent acquisition systems and their users.

Social implications: This research implies that algorithms may serve to codify deep-seated biases, making IT work environments just as homogeneous as they are currently. If bias exists in talent acquisition software, the potential for propagating inequity and harm is far more significant and widespread due to the homogeneity of the specialists creating artificial intelligence (AI) systems.

Originality/value: This work uses equity as a central concept for considering algorithmic bias in talent acquisition. Feminist design thinking provides a framework for fostering a richer understanding of what fairness means and evaluating how AI software might impact marginalized populations.