In Defense of Google’s Diversity Push

Over the last week or so, Google got in trouble for being too ‘woke.’ The GenAI team’s attempt to be more inclusive resulted in images of historical figures rendered with historically inaccurate skin colors.

I am typically the first on the bandwagon to speak critically of Google’s product culture, as I would love to see some major things changed. My inability to change it from the inside is part of why I left (and why I want to return).

But I want to take a moment to stand up for them. Yes, they jumped too soon, releasing a model that had (perhaps) a bias toward being more inclusive. An AI-generated picture of a Black pope appeared, and many started screaming about culture and Google’s significant failures. But I would like to ask that we all stop for a moment and give the team at Google a little credit.

How much of Black or Indigenous history was wiped off the map and never registered in books or other forms of recorded history? (Which means the AI models can’t train on this information.) If you are White, take an extra moment to think about how much you don’t know.

I work with underrepresented PMs daily. Every week, I send out a newsletter. It is exceedingly difficult for me to quickly find artwork for the top of the newsletter that shows Black faces. Before Midjourney it was almost impossible to find affordable options.

If I use Midjourney, I have to specifically ask for Black people or women. (But at least I can do that.) I love the new select-and-requery feature, because I can highlight a person and replace them with a Black woman or Black man. I am frustrated and horrified that when I prompt “somebody,” 95% of the time I get a white man. At the very least, I would love to see variation: one white man, one white woman, one Black man, and one Black woman, for example. But it never offers a wide variety for “someone.” And if I specify “Black person,” the output is distinctly different from what a general prompt produces. That treatment is rarely an improvement, and having to specify skin color changes the look and feel of what I want.

The algorithms are designed to spit out the history that made it into the books and to ignore so much of what was wiped out by those before us who were biased.

Yes, Google has a long way to go, but I actually applaud this recent error, because they are trying to balance the scales. Everyone heard about the Black pope error, but millions more queries, frustrating in their lack of diversity, would have actually produced the right answer, and those are getting lost in the outcry over the high-profile mistake.

Google, please keep trying to make sure GenAI doesn’t fall into the trap the vast majority of our history books fell into. Please keep trying until you get it right for everyone.
