Workplace Facial Screening is a Bad Idea – Progressive.org

Artificial intelligence has been on the rise in workplaces for at least the past decade. From consumer algorithms to quantum computing, AI's uses have grown in type and scope.

One of the more recent advances in AI is the ability to read emotions through facial and behavioral analysis. While emotional AI technology has largely been implemented in marketing campaigns and health care, a growing number of high-profile companies are using it in hiring decisions.

Companies should stop this immediately.

There are a number of risks associated with this technology. One of the more troubling is its apparent racial bias: it assigns more negative emotions to Black people than to white people, even when they are smiling.

For example, Microsoft's Face API software scored Black faces as three times more contemptuous than white faces. This bias is obviously harmful in a number of ways, but it's especially devastating to non-white professionals, who are disadvantaged in their ability to secure a job and progress within their field.

Any workplace that uses a hiring algorithm that disproportionately rates Black and brown people as emotionally worse will further drive workplace inequalities and discriminatory treatment.

According to a Washington Post report, more than 100 companies are currently using emotional AI, and the technology has already been used to assess millions of job applicants. Among the top-tier companies deploying emotional AI are Hilton, Dunkin' Donuts, IBM, and the Boston Red Sox.

Emotional AI recognition has been estimated to be at least a $20 billion market.

The technology uses facial recognition to analyze emotional expression and cognitive ability. Generally, an interviewee answers preselected questions during a recorded video interview and is then assessed by the AI algorithm. The assessment provides a grade or score on various characteristics, including verbal skills, facial movements, and even emotional traits, all of which aim to predict how likely the candidate is to succeed in a position before the company takes next steps.

Supporters of the technology argue that it removes human prejudice from the equation. But replacing human bias with an artificial one can't be the solution.

Moreover, companies tend to use emotional AI to screen applicants against a very limited data set when deciding who gets marked as employable. These limited data sets usually favor majority groups while ignoring minority ones. For example, if someone's first language isn't English and they speak with an accent, or if an applicant is disabled, they are more likely to be earmarked as less employable.

The technology can also work to the disadvantage of women.

For starters, much of the AI technology fails to properly identify women, even iconic women such as Oprah Winfrey and Michelle Obama. Many examples have shown that, particularly in fields that are already male dominated, women applicants are downgraded and less likely to be recommended than male applicants.

There are a plethora of other anecdotes that highlight the biases of emotional AI, even outside the workplace. These include cameras that identify Asian faces as blinking and software that misgenders those with darker skin.

Of course, companies have been warned of these ongoing biases and have so far ignored them; many still use software like HireVue, which Princeton Professor of Computer Science Arvind Narayanan described as a "bias perpetuation engine." The research institute AI Now, based at New York University, has called for a complete ban on emotional AI tech.

Until emotional AI is shown to be free of racial and gender biases, it's unsafe for use in a world already struggling to overcome inequalities. If companies want to assist in that struggle, they should end the use of emotional AI in the workplace.

This column was produced for the Progressive Media Project, which is run by The Progressive magazine, and distributed by Tribune News Service.

September 17, 2020