All women pictured are A.I. generated

cyu@sh.itjust.works to Technology@lemmy.world – 183 points
files.catbox.moe

Oh, is that why none of them are black?

Or even average looking.

In using Stable Diffusion for a DnD-related project, I've found that it's actually weirdly hard to get it to generate people (of either sex) who aren't attractive. I wonder if it's a bias in the training material, or a deliberate bias introduced into the models because most people want attractive people in their AI pics.

It's trained on professionally taken photos. Professional photographers tend to prefer taking photos of attractive subjects.

Yeah all of my professional photographers hate me.

That's true, but it's not like ugly people don't get photographed – ultimately, a professional photographer is going to take photos of whoever pays them to do so. That explanation accounts for part of the bias, I think, but not all of it.

If I got pictures taken by a photographer, I wouldn't allow them to be used as training data. I don't even like looking into a mirror. Maybe that's part of why there are fewer pictures of ugly people to train with.

I would guess that ugly people are less likely to commission photos.

They were created via a prompt, and that prompt probably included some tags to make them more attractive. It's standard practice to put tags like "ugly" and "deformed" into the negative prompt just to keep the hands and facial features from going wonky.
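For illustration, here's a minimal sketch of how such prompt pairs are often assembled before being handed to an image generator. The helper name and the specific tag lists are hypothetical examples, not taken from the thread or any particular tool:

```python
# Illustrative only: stack "quality" tags onto the positive prompt and
# common defect tags onto the negative prompt, a widespread convention
# when generating portraits with Stable Diffusion-style models.

QUALITY_TAGS = ["photorealistic", "detailed face", "professional photo"]
DEFECT_TAGS = ["ugly", "deformed", "bad hands", "extra fingers", "blurry"]

def build_prompts(subject):
    """Return a (prompt, negative_prompt) pair for a portrait subject."""
    prompt = ", ".join([subject] + QUALITY_TAGS)
    negative = ", ".join(DEFECT_TAGS)
    return prompt, negative

prompt, negative = build_prompts("portrait of a woman")
print(prompt)
print(negative)
```

With a library like Hugging Face diffusers, these two strings would typically be passed as the `prompt` and `negative_prompt` arguments of the pipeline call.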

There are no elderly women, no female toddlers, and so forth either. Presumably just not what whoever generated this was going for. You can get those from many AI models if you want them.

Battleship coordinates: (B10). Also, (I4) looks a lot like my niece. I really think it depends on your definition of "average", though. But as @fubo indicated, there are 0% black people in this photo. There's some vaguely Asian, roughly Middle Eastern looking, sort of South American, and whatever that is going on in (M8). But there are distinctly zero black people pictured.

Now I'm curious. What's average looking to you if none of these are?

Older and fatter for a start.

Are you American?

These people are (almost) all young and attractive, with flawless skin.

That's what pops into your head first?

It's one thing that strikes me as kinda odd.

But then, the other day I was messing around with an image generation model and it took me way too long to realize that it was only generating East Asian-looking faces unless explicitly instructed not to.

Every model is going to have something as its "average case." If you want the model to generate something else you'll have to ask it.

Models generally trend towards one thing. It's hard to create a generalized model from a mathematical standpoint. You just have to say what you want.

Yep, but you can't just say "diversity plz".

Matrix prompt with | white | black | brown | asian
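The `|` syntax here resembles the prompt-matrix / dynamic-prompts features found in common Stable Diffusion front ends. As a rough sketch of the idea (the function name and the `{a|b|c}` brace syntax are my own illustration, not any specific UI's exact format), a template with alternations can be expanded into one concrete prompt per combination:

```python
import itertools
import re

def expand_matrix(template):
    """Expand {a|b|c} alternations in a prompt template into the full
    list of concrete prompts (Cartesian product over braced groups)."""
    groups = re.findall(r"\{([^{}]*)\}", template)       # option groups
    parts = re.split(r"\{[^{}]*\}", template)            # literal text between groups
    options = [g.split("|") for g in groups]
    results = []
    for combo in itertools.product(*options):
        prompt = parts[0]
        for choice, tail in zip(combo, parts[1:]):
            prompt += choice.strip() + tail
        results.append(prompt)
    return results

print(expand_matrix("portrait of a {white|black|brown|asian} woman"))
# → ['portrait of a white woman', 'portrait of a black woman',
#    'portrait of a brown woman', 'portrait of a asian woman']
```

Each expanded prompt would then be sent to the generator separately, which is the usual way to force variety rather than hoping a single vague prompt produces it.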


That's a lot of white chicks.

So?

Points to a limited dataset consisting mostly of white people.

Without knowing the prompt vs the model, it's impossible to say which side of things is responsible for the lack of variety.

Many modern models are actually very good at reversing sample biases in the training set.

They mostly all look like that one girl from that show, whatever it was.

My god, soon they will be making them nsfw!

They what!? Oh my

Can you imagine this technology being used to impersonate a president? It would be bedlam, but fortunately there are no girl presidents!

I believe they are already doing that to make fake nude leaks for celebrities. It'll be scary.

Impressive and a bit horrifying. Still, it seems to have a big problem recreating hands. A dating profile with a set of profile pics that doesn't feature a hand will now become suspect... until it cracks hands, too.

That is somehow cool and creepy all at once. I guess it gives a macro view of what we consider attractive.

Not really. OP doesn't mention how they were generated, and how the pictures were selected.

It could just reflect the kinds of photos that went into the image generator, and the biases that were in its training data, unless the photos were chosen based on an attractiveness survey or some such.

Most of these look like the same person with different hair and makeup, so it could just be a very specialized network.

It gives a view of what whoever generated these images considers attractive.

A16, C7, still not quite there with hands. đŸ˜Ŧ

Finally, some positive representation in media for people with diverse bodies / disability! /s

There's a ton of fucked up hands in there. It also seems to struggle with handbags. It's kinda fun to try to find one that isn't flawed in some way – I haven't found one yet.

This is wild.

Is there a source or article to read or something?

There is that one brunette on the right who reminds me of a movie/video I saw where the same AI woman kept popping up.

Tell me they didn't allow American source material without telling me they didn't allow American source material.