Discussion about this post

Dinah

I had ChatGPT generate an image of Canadian life.

In the background was the main street of a town that could be any small town near me. A large Canadian flag was flying, so no one could mistake that it was Canada.

It was set in the fall, with a beautiful maple tree turning orange and red.

But then there were five people walking on the street. All white. All in winter coats.

So while the street could be any quaint, idealized version of a Canadian town, it doesn't reflect the diversity of our people.

You wouldn't only see white people on the street. You would see people of all different colours and backgrounds. Also no one is wearing a winter jacket in the fall :-).

This was a fantastic article! Thanks for making me think about how AI perceives us and our cultures.

Karen Smiley

Great insights and experiment, Rebecca! I'm running some of these experiments, asking LLMs to first describe in words what a picture of typical daily life would look like in <location> for multiple locations worldwide, then asking it to generate the image it described.

So far, Copilot is generating detailed and positive descriptions of all locations, but once outside of my home state, laptops don't get mentioned, and it loves lavender skies and women sweeping their porches. I'm definitely seeing the more rural stereotypes coming through.

Oddly, Copilot is also generating images that don't correspond to the descriptions it gave me, AND alt text strings that have little to do with the images they're attached to. 🙄

I have more scenarios and LLMs I want to run this through & will share results in an article in my Everyday Ethical AI newsletter when I'm done. 😊
