Gemini AI accused of racism against white people, Google responds

Google's new AI tool Gemini is facing backlash for being too "woke" - depicting women and minorities even when generating historical figures like America's founders.

Google has promised a fix for its new AI image generator Gemini. (Image: AI-generated by Gemini, sourced from X)

Google’s new AI image generation capabilities on Gemini are receiving flak from X (formerly Twitter) users of late. The tool, which generates images based on text prompts, has apparently been overshooting on the ‘wokeness’ front – producing images that depict people of various ethnicities even when that clashes with historical accuracy.

These ‘glitches’ have kicked off something of a firestorm on X, with right-wingers calling Google out for perpetuating racism against white people.

For instance, when an X user asked Gemini to depict America's founding fathers, the AI threw in women and people of colour, presumably in an attempt to boost diversity in its output. But in cases dealing with established historical facts, this "inclusive" approach ends up being inaccurate and awkward.

“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” says a Google statement posted on X. “We’re working to improve these kinds of depictions immediately. Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”

It’s not the first time an AI has bungled the diversity balance. Years ago, Google had to apologise after its photo app infamously labelled a photo of a Black couple as “gorillas,” according to a BBC report. More recently, OpenAI’s Dall-E image generator kept depicting CEOs and other authority figures as white men when users didn’t specify gender or race.

On the flip side, overcorrecting for bias by injecting random diversity into every image could perpetuate its own harmful stereotypes and technical issues. Gemini’s current bug highlights the snags of using crude inclusion filters.
