How to Fix Bias against Women and Latinos in Artificial Intelligence Algorithms – AL DIA News

Biases in artificial intelligence and machine learning programs are well documented, and they closely mirror the biases we see in the world today.

In the article "Algorithmic Equity in Recruitment of Underrepresented IT Job Candidates," researchers from North Carolina State University and Pennsylvania State University propose that artificial intelligence (AI) developers incorporate the concept of "feminist design thinking" when building new AI programs. They argue that doing so can improve equity, particularly in software used in recruitment processes.

"There are countless stories about the ways bias manifests itself in artificial intelligence, and there are many pieces of thinking about what contributes to this bias," Fay Payton, professor of information systems/technology on the faculty at the University of North Carolina, said in a news release.

The researchers' goal is to propose guidelines that can help develop viable solutions to eliminate algorithmic bias against women, African Americans, and Latinos in the information technology workforce.

"Too many existing hiring algorithms de facto incorporate identity markers that exclude qualified candidates based on gender, race, ethnicity, age, etc.," says Payton, who is the lead co-author of the research. "We are simply looking for equity: that candidates can participate in the recruitment process on an equal basis.

Payton and her collaborators argue that feminist design thinking could serve as a valuable framework for developing software that significantly reduces algorithmic bias. In this context, applying feminist design thinking means incorporating the idea of equity into the design of the algorithm itself.
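As a rough illustration of what building equity into a hiring algorithm could look like in practice, the sketch below keeps identity markers out of a screening model's inputs and audits selection rates across demographic groups before results are used. This is a hypothetical example, not the researchers' actual method; the column names, sample data, scoring rule, and the 0.8 threshold are all assumptions made for illustration.

```python
# Hypothetical sketch (not the researchers' method): exclude identity markers
# from screening features and audit selection rates across groups.
import pandas as pd

IDENTITY_MARKERS = ["gender", "race_ethnicity", "age"]  # never used as features

def job_relevant_features(candidates: pd.DataFrame) -> pd.DataFrame:
    """Drop identity markers so they cannot influence scoring."""
    return candidates.drop(columns=IDENTITY_MARKERS, errors="ignore")

def audit_selection_rates(candidates: pd.DataFrame, selected: pd.Series,
                          group_col: str) -> pd.Series:
    """Share of candidates advanced from each demographic group."""
    rates = selected.groupby(candidates[group_col]).mean()
    if rates.min() / rates.max() < 0.8:  # "four-fifths" rule of thumb
        print(f"Selection rates differ widely across '{group_col}':\n{rates}")
    return rates

# Tiny illustrative example with made-up data
candidates = pd.DataFrame({
    "years_experience": [5, 3, 7, 2, 6, 4],
    "skills_score":     [80, 75, 90, 60, 85, 70],
    "gender":           ["F", "F", "M", "F", "M", "M"],
})
scores = job_relevant_features(candidates)["skills_score"]  # stand-in for a model
selected = scores >= 75
print(audit_selection_rates(candidates, selected, "gender"))
```

The point of the audit step is that simply removing identity markers is not enough on its own; checking outcomes by group is what surfaces bias that creeps in through correlated features.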

"The effects of algorithmic bias are compounded by the historical under-representation of women and African-American and Latino software engineers who bring new ideas to equitable design approaches based on their life experiences," says Lynette Yarger, associate professor of information science and technology at Penn State.

