4 Ways to Mitigate Bias in AI and Close the Diversity Deficit

Key lessons from a 2024 Cannes activation


Feed a prompt to an AI image generator and you’re bound to encounter an insidious pattern: Do the people look … too stunning? Perhaps even wanton?

Gender, race, body type, nationality, religion—use these descriptors in prompts and you’re almost guaranteed to get prejudiced, outdated stereotypes. And “wanton” is a deliberate word choice: it’s used mostly as a pejorative toward women, and AI tends to oversexualize images of women. These glaring imbalances showcase a recurring problem with AI outputs: the replication of societal biases, which can harm real people and communities.

I wrestled with this firsthand while helping develop Sir Martian, one of our key AI demos featured at Cannes earlier this year.