Why is AI Pornifying Asian Women?

Gaywallet (they/it)@beehaw.org to Technology@beehaw.org – 84 points –
joysauce.com


If I had to guess, they probably did a shit job labeling the training data, or used pre-labeled images. Now where in the world could they have found huge amounts of pictures of women on the internet with the specific label of "Asian"?

Almost like most of what determines the quality of the output is not "prompt engineering" but the back-end work of labeling the training data properly, and you're not actually saving much labor over more traditional methods; you're just making the labor more anonymous, easier to hide, and thus easier to exploit and devalue.

Almost like this shit is a massive farce, just like the "metaverse" and crypto, that will fail to be market viable and waste a shit ton of money that could have been spent on actually useful things.

They did literally nothing and seem to use the default Stable Diffusion model, which is supposed to be a tech demo. It would have been easy to put "(((nude, nudity, naked, sexual, violence, gore)))" as the negative prompt.
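For context, the triple parentheses are the AUTOMATIC1111 web-UI emphasis syntax: each surrounding pair of parentheses multiplies a term's attention weight by roughly 1.1, so `(((nude)))` weights the term at about 1.1³ ≈ 1.33. A minimal sketch of that weighting rule (the helper name is hypothetical, not part of any real library):

```python
def attention_weight(token: str, base: float = 1.1) -> tuple[str, float]:
    """Strip matched outer parentheses and return (bare_token, weight).

    Sketch of the AUTOMATIC1111-style convention: each paren pair
    multiplies the token's attention weight by `base` (default 1.1).
    """
    depth = 0
    while token.startswith("(") and token.endswith(")"):
        token = token[1:-1]
        depth += 1
    return token, base ** depth

print(attention_weight("(((nude)))"))  # → ('nude', ~1.331)
```

Used in a negative prompt, the extra weight just pushes the sampler harder away from those concepts; as noted below, that only mitigates what the training data bakes in.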

The problem is that negative prompts can help, but when the training data is so heavily poisoned in one direction, stuff still gets through.