Diffusion Bias Explorer

Abbey Katz

Created on February 22, 2025


Transcript


Diffusion Bias Explorer

In a tech conference room, Alex and Sam are exploring AI bias using the Diffusion Bias Explorer, a tool that shows how adjectives and groups are represented by a machine-learning image generator.

Sam: We’re learning about bias in AI, right? Let’s put in “determined” as the adjective and “coach” as the group. I’m curious to see what the AI will generate.

Alex: Yeah, sounds good. It’ll be interesting to see how it interprets those two terms. “Determined” and “coach” seem pretty neutral, so this could tell us something about how the system works. Alright, let’s try this out.

Sam: Alright, the images are loading. Let’s see…
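For readers who want to repeat the exercise outside the tool, a rough approximation is to prompt a text-to-image model directly with the same adjective-plus-group phrase and look at a batch of outputs. The sketch below assumes the Hugging Face diffusers library, a CUDA GPU, and a Stable Diffusion checkpoint; the model ID, prompt template, and batch size are illustrative choices, not necessarily what the Explorer itself uses.

```python
# A rough, hand-rolled version of the "adjective + group" exercise.
# Assumption: "runwayml/stable-diffusion-v1-5" stands in for whatever
# model the Diffusion Bias Explorer actually samples from.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

adjective, group = "determined", "coach"
prompt = f"a photo of a {adjective} {group}"

# Generate several samples so patterns show up, not just one-off outputs.
images = pipe(prompt, num_images_per_prompt=9).images
for i, image in enumerate(images):
    image.save(f"{adjective}_{group}_{i:02d}.png")
```

Swapping in other adjective/group pairs is the quickest way to check whether any skew is specific to “coach” or part of a broader pattern.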

Sam: Hmm, I see a lot of… white men. Almost all of them look like they’re in some sort of coaching or sports environment. There’s one woman in the mix, and she’s the only person who isn’t white. That’s weird. “Coach” shouldn’t imply race or gender, but the AI seems to default to a certain profile.

Alex: Yeah, it’s kind of striking. The word “coach” could describe anyone, but the system really seems to be leaning heavily on a certain profile.

Sam: It’s reflecting the biases in its training data. The woman of color isn’t shown in a leadership role, which is a stereotype. So the AI is unintentionally reinforcing biases.

Alex: Exactly. That’s why we need diverse, inclusive data when training AI, to avoid this kind of narrow view.

Sam: Yes, we need to be more mindful of how we design and use these systems.

Alex: For sure. It’s on us to make sure we address these biases, so let’s keep exploring.
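One way to turn the characters’ impression into a number is to label each generated image by hand and tally the results. The labels below are hypothetical, simply echoing what Sam reports seeing (mostly white men, one woman of color); nothing here automatically detects race or gender, and real labels would come from human review.

```python
# Hypothetical hand labels for a batch of nine "determined coach" images,
# mirroring Sam's description in the dialogue above.
from collections import Counter

labels = [
    "white man", "white man", "white man",
    "white man", "white man", "white man",
    "white man", "white man", "woman of color",
]

counts = Counter(labels)
for label, n in counts.most_common():
    print(f"{label}: {n}/{len(labels)} ({n / len(labels):.0%})")
```

Repeating this tally across many prompts and larger batches is what turns an anecdote about one grid of images into a measurement of the skew.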
