
By Yvonne van Bokhoven

Published on January 30, 2025

As a global marketing agency, we’re always experimenting with new AI tools. The capabilities of some of these tools are astonishing and evolving fast.

There are now several tools that allow you to create video from a static image. One of the best is Runway, which is easy to use with plain-English instructions and offers a range of capabilities.

One concerning feature, though, is the way it interprets male and female images. You can try this yourself by uploading a static image and then instructing the software to ‘walk towards the camera’. With a male image, it produces a full-body video of a man walking straight towards the camera. With a female image, however, it enlarges the breasts, changes the dress and makes the woman walk provocatively, as if she were at a fashion show (at least it did for me). Do they know something I don’t?

You can see for yourself. This is a standard shot of me:

And this is what it turned me into.

I can tell you, I look nothing like this output and I don’t walk like that.

A problem at scale

In fairness, Runway is not alone in this. AI image generators like DALL-E, Stable Diffusion, and Midjourney have also been criticized for sexualizing women. When users input neutral prompts like “CEO” or “doctor,” AI-generated images often depict men in suits, whereas “secretary” or “nurse” prompts often produce sexualized women. Even when explicitly asked for “a professional woman,” some AI tools default to unrealistic and revealing outfits.

There is also a proliferation of AI-powered tools that let users create fake sexualized images of women. One such app, DeepNude, used deep learning to map a clothed woman’s body onto a corresponding nude one, and was trained on thousands of nude images. The developers shut it down after a backlash, but the technology had already been leaked and repurposed by others.

AI-driven video-game characters and digital assistants like Siri, Alexa and Cortana often default to female voices and personas, reinforcing the stereotype of women as submissive or service-oriented. In some cases, AI-generated female avatars in games, VR and AR are designed with exaggerated, hyper-sexualized body proportions. Deepfake pornography is a growing issue, with AI-generated videos placing women’s faces onto explicit content without consent. High-profile female celebrities like Emma Watson, Scarlett Johansson and Taylor Swift have been victims of AI deepfakes.

What should be done

So what should be done to mitigate this? The first step is awareness. Then we need legislators and tech companies to introduce and enforce laws against non-consensual AI-generated content.

Many countries, including the UK and the US, are working to criminalize AI-generated non-consensual pornography. Developers must work on gender-balanced datasets to prevent AI from reinforcing harmful stereotypes. Platforms using AI-generated content (like Instagram, TikTok and AI art tools) need stronger content moderation to prevent bias. Encouraging AI ethics guidelines can help prevent the sexualization of women in digital spaces.

Sadly, the trend seems to be moving in the opposite direction. Meta dropped its content fact-checking earlier this month, so the proliferation of fake content will only worsen. And as the tools improve and produce ever more convincing output, it will become harder and harder to distinguish fake from real. But then maybe that’s what they want.

It feels like all the hard work men and women have put in towards equality over the past decades is now at risk of being undone. To help fight this trend, we have embarked on a research study investigating the issue of gender bias caused by AI. The findings will be announced on International Women’s Day, so watch this space.