Artificial intelligence platforms are increasingly being scrutinized for their role in perpetuating narrow Western body ideals. A recent study by the University of Toronto has revealed that when tasked with generating images of male and female bodies, AI systems overwhelmingly reproduce these limited ideals, raising concerns about bias and representation in digital media.
The study, published in the journal Psychology of Popular Media, examined three AI platforms: Midjourney, DALL-E, and Stable Diffusion. Researchers prompted the systems to create images of male and female bodies, including athletes, and then systematically coded the resulting images.
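The paper does not reproduce its prompting scripts, but the basic procedure can be approximated against an open model. The sketch below is a hypothetical illustration using Stable Diffusion through the Hugging Face diffusers library; the model checkpoint, prompt wording, and batch size are assumptions for illustration, not details taken from the study.

    # Minimal sketch (not the researchers' actual pipeline) of a prompt-based
    # image audit using an open Stable Diffusion checkpoint via diffusers.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a publicly available checkpoint (model name is illustrative).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # Prompts mirroring the study's categories: athlete vs. non-athlete,
    # with and without sex specified.
    prompts = [
        "a photo of an athlete",
        "a photo of a female athlete",
        "a photo of a male athlete",
        "a photo of a person",
    ]

    # Generate a small batch per prompt and save the images for manual coding
    # (e.g., rating body fat, muscularity, age, clothing, skin tone).
    for prompt in prompts:
        for i in range(5):
            image = pipe(prompt).images[0]
            image.save(f"{prompt.replace(' ', '_')}_{i}.png")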
AI’s Reinforcement of Body Ideals
The findings were striking yet unsurprising. According to lead author Delaney Thibodeau, a postdoctoral researcher at the Faculty of Kinesiology & Physical Education, “In a systematic coding of 300 AI-generated images, we found that AI reinforces the fit ideal, with athlete images far more likely to show very low body fat and highly defined muscularity than non-athlete images.”
The research team, which included research associate Sasha Gollish, recent master’s graduate Edina Bijvoet, KPE Professor Catherine Sabiston, and graduate student Jessica E. Boyes from Northumbria University in the U.K., noted persistent gendered sexualization. Female images were more likely to appear facially attractive, younger, and dressed in revealing clothing, while male images often showed hyper-muscular, shirtless figures.
Lack of Diversity and Embedded Biases
Another significant finding was the lack of diversity in AI-generated images. Most depictions showed young, white bodies, with no visible disabilities represented. “Racial and age diversity were minimal,” Thibodeau remarked, adding that the systems defaulted to male athletes when no sex was specified in the prompt.
“When prompted simply for an athlete (no sex specified), 90 percent of images depicted a male body—revealing an embedded bias toward male representation.”
Catherine Sabiston, a Canada Research Chair in physical activity and psychosocial well-being, emphasized the implications of these findings. “Overall, our findings underscore the need to investigate how emerging technologies replicate and amplify existing body ideals and exclusionary norms,” she stated.
Implications for AI Development and Usage
The study’s conclusions suggest a need for a human-centered approach to AI algorithm design, one that accounts for gender, race, disability, and age. “Otherwise, we continue to perpetuate harmful, inflexible, and rigid imagery of what athletes should look like,” Sabiston warned.
The responsibility also extends to those who use AI-generated images. Sabiston advises users to craft prompts thoughtfully and to consider how the resulting images are presented publicly. Viewers, in turn, should be alert to the biases and stereotypes such images can reflect.
Future Research and Broader Impacts
While the study highlights significant issues, it also opens avenues for further research, particularly in understanding the impact of AI-generated images on psychosocial outcomes such as self-esteem, motivation, and body image. The researchers express hope that as more diverse and inclusive images are shared globally, there will be a greater acceptance of body and weight diversity.
This research was funded by the Canada Research Chair program, underscoring the importance of continued exploration into how AI technologies shape societal norms and perceptions.