A study published in the prestigious journal Nature highlights a discriminatory and largely unrecognized digital representation bias.
The analysis reveals that on the internet, including in artificial intelligence systems, women are systematically perceived and represented as younger than men, contradicting real-world demographic data.
This massive distortion has been documented through the analysis of nearly 1.4 million images and videos from major platforms (Google, Wikipedia, YouTube, Flickr), as well as the examination of nine large language models. The Yiaho team looked into this topic and we’re sharing the most important findings from the study:
An artificial age gap in images and videos
The researchers analyzed over 1.3 million images related to 3,495 social categories (professions, roles). Regardless of the platform or evaluation method (human judgment, algorithms, or verified biographical data), the finding is the same: women consistently appear younger.
On Google Images, the gap averages 0.37 age categories for gender-neutral searches. On Wikipedia, it rises to 0.71 age categories.
Editor’s note: the “age category” unit is a relative measure based on age ranges assigned by coders. A gap of 0.37 means that the average age attributed to men is significantly higher than that attributed to women.
The study shows that this gap is particularly pronounced for high-status, high-income professions: in occupations considered more prestigious, or associated with higher median salaries, men are significantly more likely to be represented as older than women.
A demographic contradiction
U.S. census data indicate that there is no systematic age difference between men and women in the workforce. For the past decade, the proportion of women in a profession has not been correlated with that profession’s median age.
Yet online images sometimes reverse this reality: in sales or human resources sectors, Google Images shows men as older than women, while official U.S. data indicates the opposite.
Language models reproduce the distortion
The analysis of nine large language models (including GPT) confirms this association: the more a social category is perceived as masculine, the more strongly it is linked to older age. This bias appears across billions of words, regardless of the textual data source (Reddit, Google News, Wikipedia, Twitter).
The specific case of ChatGPT
In simulated recruitment scenarios, ChatGPT generated nearly 40,000 resumes and systematically assigned female candidates an age 1.6 years younger, and 0.92 fewer years of experience, than male candidates.
Worse, when evaluating those same resumes, ChatGPT gave the highest scores to older men. Women remained at a clear disadvantage compared to senior men.
The algorithm amplifies the bias and impacts hiring
The study shows that mainstream algorithms don’t just reflect this bias, they actively amplify it.
Impact on perception (Google Images): in one experiment, participants shown an image of a woman representing a profession on Google Images estimated the average age of people in that profession as 5.46 years younger than participants shown a man. This effect exceeds participants’ preexisting biases.
Impact on decisions (hiring): participants also reported a preference for hiring younger candidates in professions they associated with women, and older candidates in professions they associated with men. This age-gender bias directly influences judgments of aptitude.
An intersectional bias to correct
This study documents an intersectional age-gender bias that spans all digital formats. The amplification of this bias by algorithms risks worsening inequalities: women are confined to roles perceived as “young” and less experienced, while positions of authority remain associated with older, more experienced men.
The authors call for developing methods to actively detect and correct these multidimensional distortions, which are harder to spot than discrimination along a single axis.
Source: Nature.com


