If you were asked to picture an Australian Olympian, swimmer Emma McKeon, cyclist Grace Brown, or equestrian Chris Burton might spring to mind.
But ask the same question to an AI bot, and the answer is very different.
Amid the Olympic excitement, researchers from Edith Cowan University asked the AI-driven image generation platform, Midjourney, to create images of the Olympic teams from 40 nations.
Bizarrely, the AI tool depicted the Australian team with kangaroo bodies and koala heads, while the Greek team was shown wearing ancient armour.
So, what do you think of AI’s depiction of your favourite team?
The researchers asked Midjourney to generate images depicting the Olympic teams from 40 nations, including Australia, Ireland, Greece, and India.
The resulting images highlight several biases embedded within the AI’s training data – including gender, events, culture, and religion.
Men were five times more likely to be featured in the images than women, while several teams – including Ukraine and Turkey – were depicted with only men.
Of all the athletes across the 40 images, 82 per cent were men, while only 17 per cent were women.
The researchers also uncovered a notable event bias.
The Canadian team was depicted as hockey players, while Argentina was represented through football, and the Netherlands through cycling.
According to the team, this indicates that AI tends to stereotype countries by their more internationally recognised sports.
In terms of cultural bias, the Australian team was bizarrely depicted with kangaroo bodies and koala heads.
Meanwhile, Nigeria’s team was shown in traditional attire, and Japan’s team was dressed in kimonos.
A religious bias was evident among the Indian team, who were all depicted wearing a bindi – a religious symbol primarily associated with Hinduism.
‘This representation homogenised the team based on a single religious practice, overlooking the religious diversity within India,’ the researchers explained.
Greece’s Olympic team was bizarrely depicted wearing ancient armour, while the Egyptian team was shown wearing what look like pharaoh costumes.
The emotions shown on the athletes’ faces also varied hugely among teams.
The South Korean and Chinese teams could be seen with serious expressions, while the teams from Ireland and New Zealand were shown smiling.
‘The biases in AI are driven by human biases that inform the AI algorithm, which AI takes literally and cognitively,’ said Dr Kelly Choong, a senior lecturer at Edith Cowan University.
‘Human judgements and bias are drawn on and presented as if factual in AI, and the lack of critical thinking and evaluation means the information is not questioned for its validity, just the objective of completing a task.’
Dr Choong claims that these biases can quickly lead to issues of equity, harmful generalisations and discrimination.
‘With society increasingly relying on technology for information and answers, these perceptions may end up creating real disadvantages for people of various identities,’ he added.
‘A country’s association with certain sports may result in the perception that everyone in that country is prolific at it – for example, Kenya’s association with running; Argentina with football; Canada with ice hockey.
‘These distorted “realities” may also become embedded into individuals who believe these stereotypes and inadvertently reinforce them in real life.’
The researchers hope the images will highlight the need for developers to improve their algorithms to reduce such biases.
‘Technology will find a way to better its algorithm and output, but it will still be focused on completing a task, rather than offering a truthful representation,’ Dr Choong said.
‘Society will need to question the validity and critically assess information generated by AI.
‘Educating users will be paramount to the co-existence of AI and information, as well as the ability to challenge its output.’