Service that uses AI to identify gender based on names looks incredibly biased

Some tech companies make a splash when they launch; others seem to bellyflop.

Genderify, a new service that promised to identify someone's gender by analyzing their name, email address, or username with the help of AI, falls firmly into the latter camp. The company launched on Product Hunt last week, but picked up a lot of attention on social media as users discovered biases and inaccuracies in its algorithms.

Type the name "Meghan Smith" into Genderify, for example, and the service offers the assessment: "Male: 39.60%, Female: 60.40%." Change that name to "Dr Meghan Smith," however, and the assessment changes to: "Male: 75.90%, Female: 24.10%." Other names prefixed with "Dr" produce similar results, while inputs seem to generally skew male. "Test@test.com" is said to be 96.90 percent male, for example, while "Mrs Joan smith" is 94.10 percent male.
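
Genderify hasn't published how its model works, but behavior like this is consistent with a naive classifier that treats every token in the input, titles included, as a gender signal. Here's a minimal, hypothetical sketch in Python; the training data, the token-frequency approach, and the resulting numbers are all invented for illustration, not taken from Genderify:

```python
# Hypothetical sketch, NOT Genderify's actual code: shows how a token-frequency
# model trained on a skewed, binary name/gender dataset can let a title like
# "Dr" override a strongly gendered first name.
from collections import defaultdict

# Invented training data: "dr" co-occurs mostly with "male" labels, so the
# model learns the title itself as a gender signal.
TRAINING = [
    ("meghan smith", "female"),
    ("dr john smith", "male"),
    ("dr david lee", "male"),
    ("dr mark webb", "male"),
    ("dr paul kent", "male"),
    ("dr susan cole", "female"),
]

counts = defaultdict(lambda: {"male": 0, "female": 0})
for name, label in TRAINING:
    for token in name.split():
        counts[token][label] += 1

def predict(name: str) -> dict:
    """Multiply per-token male/female frequencies (with add-one smoothing)."""
    male = female = 1.0
    for token in name.lower().split():
        c = counts[token]
        total = c["male"] + c["female"] + 2
        male *= (c["male"] + 1) / total
        female *= (c["female"] + 1) / total
    return {"male": male / (male + female), "female": female / (male + female)}

print(predict("Meghan Smith"))     # leans female (~67%)
print(predict("Dr Meghan Smith"))  # the "dr" token flips it male (~56%)
```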

The outcry against the service has been so great that Genderify tells The Verge it's shutting down altogether. "If the community don't want it, maybe it was fair," said a representative via email. Genderify.com has been taken offline, and its free API is no longer accessible.

Although these sorts of biases appear regularly in machine learning systems, the thoughtlessness of Genderify seems to have surprised many experts in the field. The response from Meredith Whittaker, co-founder of the AI Now Institute, which studies the impact of AI on society, was somewhat typical. "Are we being trolled?" she asked. "Is this a psyop meant to distract the tech+justice world? Is it cringey tech April Fools' Day already?"

The problem is not that Genderify made assumptions about someone's gender based on their name. People do this all the time, and sometimes make mistakes in the process. That's why it's polite to find out how people self-identify and how they want to be addressed. The problem with Genderify is that it automated these assumptions, applying them at scale while sorting individuals into a male/female binary (and so ignoring individuals who identify as non-binary) and reinforcing gender stereotypes in the process (such as: if you're a doctor, you're probably a man).

The potential harm of this depends on how and where Genderify was applied. If the service had been integrated into a medical chatbot, for example, its assumptions about users' genders might have led to the chatbot issuing misleading medical advice.

Thankfully, Genderify didn't seem to be aiming to automate this sort of system, but was primarily designed to be a marketing tool. As Genderify's creator, Arevik Gasparyan, said on Product Hunt: "Genderify can obtain data that will help you with analytics, enhancing your customer data, segmenting your marketing database, demographic statistics, etc."

In the same comment section, Gasparyan acknowledged the concerns of some users about bias and ignoring non-binary individuals, but didn't offer any concrete answers.

One user asked: "Let's say I choose to identify as neither Male or Female, how do you approach this? How do you avoid gender discrimination? How are you tackling gender bias?" Gasparyan replied that the service makes its decisions based on "already existing binary name/gender databases," and that the company was actively looking into ways of improving the experience for transgender and non-binary visitors by "separating the concepts of name/username/email from gender identity." It's a confusing answer given that the entire premise of Genderify is that this data is a reliable proxy for gender identity.
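
For context, a "binary name/gender database" of the kind Gasparyan describes is typically just a per-name table of male/female counts. The following sketch is an assumption about how such a lookup might handle names, emails, and usernames; the table, the tokenizer, and the fallback numbers are all hypothetical, chosen only to mirror the skewed outputs reported above:

```python
# Hypothetical sketch of a lookup against a binary name/gender database.
# Not Genderify's code: the table, tokenizer, and fallback are invented.
import re

# A binary database stores only male/female counts per first name, so
# non-binary identity is unrepresentable by construction.
NAME_DB = {
    "joan":   {"male": 120,  "female": 9880},
    "meghan": {"male": 40,   "female": 9960},
    "john":   {"male": 9900, "female": 100},
}

def guess_gender(raw: str) -> dict:
    # Crudely take the first token of a name, email, or username as the
    # "first name" -- which grabs "mrs" from "Mrs Joan Smith" and "test"
    # from "test@test.com", neither of which is in the table.
    token = re.split(r"[@._\s]+", raw.strip().lower())[0]
    entry = NAME_DB.get(token)
    if entry is None:
        # Unknown inputs still need an answer; a male-skewed default like
        # this would explain why unrecognized inputs generally skew male.
        return {"male": 0.969, "female": 0.031}
    total = entry["male"] + entry["female"]
    return {"male": entry["male"] / total, "female": entry["female"] / total}

print(guess_gender("Mrs Joan Smith"))  # falls through to the skewed default
print(guess_gender("test@test.com"))   # ditto
print(guess_gender("Joan Smith"))      # found in the table: strongly female
```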

The company told The Verge that the service was very similar to those offered by existing companies that use databases of names to guess an individual's gender, though none of them use AI.

"We understand that our model will never provide ideal results, and the algorithm needs significant improvements, but our goal was to build a self-learning AI that will not be biased as any existing solutions," said a representative via email. "And to make it work, we very much relied on the feedback of transgender and non-binary visitors to help us improve our gender detection algorithms as best as possible for the LGBTQ+ community."

Update Wednesday July 29, 12:42PM ET: Story has been updated to confirm that Genderify has been shut down and to add additional comment from a representative of the firm.
