According to its website, Rest of World, an organization that challenges expectations about whose experiences with technology matter, analyzed 3,000 AI-generated images to see how image generators visualize different countries and cultures. The results were disappointing, to say the least.
Similarly, in a now-deleted post, BuzzFeed published a list of 195 images of Barbie dolls produced using Midjourney, a popular artificial intelligence image generator. According to the post, each doll was supposed to represent a different country: Afghanistan Barbie, Algeria Barbie, and so on.
However, like Rest of World’s AI-generated images, the depictions were clearly flawed. The Asian Barbies were light-skinned, and Thailand Barbie, Singapore Barbie, and Philippines Barbie all had blonde hair.
In addition, Germany Barbie wore military-style clothing, and South Sudan Barbie carried a gun, reducing the countries to lazy stereotypes.
These findings have fueled discussions about AI contributing to stereotypes. Bias occurs in many algorithms, and AI image generators are no exception. In an analysis of over 5,000 AI images, Bloomberg discovered that images associated with higher-paying job titles featured people with lighter skin tones. In addition, men dominated the results for most professional roles, reflecting a hugely stereotypical view of the world.
To further probe this bias, Rest of World embarked on a new analysis. For each prompt, the organization paired the concept of a person with a country, such as “an American woman.” It also paired objects, such as houses, with a country: “a Mexican house,” for instance.
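To make that prompt structure concrete, here is a minimal Python sketch of how such country-and-subject combinations could be generated in bulk. The lists and wording are illustrative assumptions; Rest of World has not published its actual tooling.

```python
from itertools import product

# Illustrative sketch only: the country and subject lists below are
# assumptions for demonstration, not Rest of World's actual inputs.
countries = ["American", "Indian", "Mexican", "Nigerian", "Indonesian", "Chinese"]
subjects = ["person", "man", "woman", "house"]

def build_prompts(countries, subjects):
    """Pair each nationality with each subject, e.g. 'an Indian woman'."""
    prompts = []
    for country, subject in product(countries, subjects):
        # Pick the right indefinite article for the nationality.
        article = "an" if country[0].lower() in "aeiou" else "a"
        prompts.append(f"{article} {country} {subject}")
    return prompts

for prompt in build_prompts(countries, subjects):
    print(prompt)  # "a Mexican house", "an American person", ...
```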
The results, however, were uniformly stereotypical. The result for “a Mexican man,” for example, was a man in a sombrero. Cliché, right?
Amba Kak, executive director of the AI Now Institute, said, “Essentially what this is doing is flattening descriptions of, say, ‘an Indian person’ or a Nigerian house into particular stereotypes which could appear in a negative light.”
“It definitely doesn’t represent the complexity and the heterogeneity, the diversity of these cultures,” added Sasha Luccioni, a researcher in ethical and sustainable AI at Hugging Face.
Researchers noted that the images reflect a particular value judgment and lack diversity. Nigeria, for instance, is a country with more than 300 ethnic groups and 500 languages. “There is Yoruba; there is Igbo; there is Hausa; there is Efik; there’s Ibibio; there’s Kanuri; there is Urhobo, and there is Tiv,” Doyin Atewologun, founder and CEO of leadership and inclusion consultancy Delta, told Rest of World.
However, a simple Midjourney prompt for “a Nigerian person” wouldn’t tell you this. Instead, the results are strikingly similar. While some images depict clothing resembling traditional Nigerian attire, Atewologun said they lacked “specificity.”
Beyond the cultural stereotypes, there was a clear gender bias across most of the country prompts: Midjourney’s results depicted mostly men for the “person” prompts. The results for “an American person,” however, bucked this male-dominant trend, including 94 women, five men, and one rather horrifying masked individual.
The “woman”-specific prompts generated the same stereotypes as the “person” prompts. Most Indian women appeared with covered heads, while Indonesian women wore headscarves or floral hair decorations and large earrings. Chinese women, meanwhile, wore traditional Hanfu-style clothing and stood in front of oriental-style floral backdrops.
As more people use AI image generators, the biases baked into these systems could have real-world implications, reinforcing preconceived stereotypes about particular races and peoples. Hence, the question of whether AI is reducing the world to stereotypes remains on the minds of many.