Google: Gemini AI now can’t depict people
Google said Thursday that it is temporarily stopping its Gemini artificial intelligence chatbot from generating images of people, a day after apologizing for “inaccuracies” in historical depictions it was creating.
Gemini users this week posted screenshots on social media of what they said were the chatbot’s renderings of historically white-dominated scenes with racially diverse characters, leading critics to ask whether the company is over-correcting for the risk of racial bias in its AI model.
“We’re already working to address recent issues with Gemini’s image generation feature,” Google said in a post on X. “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”
Previous studies have shown that AI image-generators can amplify racial and gender stereotypes found in their training data and, without filters, are more likely to depict lighter-skinned men when asked to generate a person in various contexts.
Google said Wednesday that it’s “aware that Gemini is offering inaccuracies in some historical image generation depictions” and that it’s “working to improve these kinds of depictions immediately.”
Gemini can generate a “wide range of people,” which the company said is “generally a good thing” because people around the world use the system, but the feature is “missing the mark.”
University of Washington researcher Sourojit Ghosh, who has studied bias in AI image-generators, said he’s in favor of Google pausing the generation of people’s faces but is a “little conflicted about how we got to this outcome.” Contrary to claims of so-called white erasure and the premise that Gemini refuses to generate faces of white people — ideas circulating on social media this week — Ghosh’s research has largely found the opposite.
“The rapidness of this response in the face of a lot of other literature and a lot of other research that has shown traditionally marginalized people being erased by models like this — I find a little difficult to square,” he said.