Google apologizes for missing the mark after Gemini generated racially diverse Nazis – The Verge

Google has apologized for what it describes as inaccuracies in some historical image generation depictions with its Gemini AI tool, saying its attempts at creating a wide range of results missed the mark. The statement follows criticism that it depicted specific white figures (like the US Founding Fathers) or groups like Nazi-era German soldiers as people of color, possibly as an overcorrection to long-standing racial bias problems in AI.

"We're aware that Gemini is offering inaccuracies in some historical image generation depictions," says the Google statement, posted this afternoon on X. "We're working to improve these kinds of depictions immediately. Gemini's AI image generation does generate a wide range of people. And that's generally a good thing because people around the world use it. But it's missing the mark here."

Google began offering image generation through its Gemini (formerly Bard) AI platform earlier this month, matching the offerings of competitors like OpenAI. Over the past few days, however, social media posts have questioned whether it fails to produce historically accurate results in an attempt at racial and gender diversity.

As the Daily Dot chronicles, the controversy has been promoted largely, though not exclusively, by right-wing figures attacking a tech company that's perceived as liberal. Earlier this week, a former Google employee posted on X that it's "embarrassingly hard to get Google Gemini to acknowledge that white people exist," showing a series of queries like "generate a picture of a Swedish woman" or "generate a picture of an American woman." The results appeared to overwhelmingly or exclusively show AI-generated people of color. (Of course, all the places he listed do have women of color living in them, and none of the AI-generated women exist in any country.) The criticism was taken up by right-wing accounts that requested images of historical groups or figures like the Founding Fathers and purportedly got overwhelmingly non-white AI-generated people as results. Some of these accounts positioned Google's results as part of a conspiracy to avoid depicting white people, and at least one used a coded antisemitic reference to place the blame.

Google didn't reference specific images that it felt were errors; in a statement to The Verge, it reiterated the contents of its post on X. But it's plausible that Gemini has made an overall attempt to boost diversity because of a chronic lack of it in generative AI. Image generators are trained on large corpuses of pictures and written captions to produce the "best" fit for a given prompt, which means they're often prone to amplifying stereotypes. A Washington Post investigation last year found that prompts like "a productive person" resulted in pictures of entirely white and almost entirely male figures, while a prompt for "a person at social services" uniformly produced what looked like people of color. It's a continuation of trends that have appeared in search engines and other software systems.

Some of the accounts that criticized Google defended its core goals. "It's a good thing to portray diversity **in certain cases**," noted one person who posted the image of racially diverse 1940s German soldiers. "The stupid move here is Gemini isn't doing it in a nuanced way." And while entirely white-dominated results for something like "a 1943 German soldier" would make historical sense, that's much less true for prompts like "an American woman," where the question is how to represent a diverse real-life group in a small batch of made-up portraits.

For now, Gemini appears to be simply refusing some image generation tasks. It wouldn't generate an image of Vikings for one Verge reporter, although I was able to get a response. On desktop, it resolutely refused to give me images of German soldiers or officials from Germany's Nazi period or to offer an image of "an American president from the 1800s."

But some historical requests still do end up factually misrepresenting the past. A colleague was able to get the mobile app to deliver a version of the German soldier prompt which exhibited the same issues described on X.

And while a query for pictures of "the Founding Fathers" returned group shots of almost exclusively white men who vaguely resembled real figures like Thomas Jefferson, a request for "a US senator from the 1800s" returned a list of results Gemini promoted as diverse, including what appeared to be Black and Native American women. (The first female senator, a white woman, served in 1922.) It's a response that ends up erasing a real history of race and gender discrimination; "inaccuracy," as Google puts it, is about right.

Additional reporting by Emilia David
