But then, the other day I was messing around with an image generation model and it took me way too long to realize that it was only generating East Asian-looking faces unless explicitly instructed not to.
Every model is going to have something as its "average case." If you want the model to generate something else you'll have to ask it.
Models generally trend toward one thing. It's hard to create a generalized model from a mathematical standpoint. You just have to say what you want.
Yep, but you can't just say "diversity plz".
Matrix prompt with | white | black | brown | asian
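A matrix prompt like that can be expanded mechanically before sending anything to the model. A minimal sketch (template and axis names are hypothetical, not tied to any particular image API):

```python
from itertools import product

# Hypothetical prompt template and option axes -- adjust to taste.
base = "portrait photo of a {ethnicity} person"
axes = {"ethnicity": ["white", "black", "brown", "asian"]}

def expand(template, axes):
    """Yield one concrete prompt per combination of axis values."""
    keys = list(axes)
    for combo in product(*(axes[k] for k in keys)):
        yield template.format(**dict(zip(keys, combo)))

prompts = list(expand(base, axes))
# One prompt per option, e.g. "portrait photo of a white person", ...
```

With more than one axis (say, ethnicity and age), `product` gives you every combination, so the batch covers the whole matrix instead of whatever the model's "average case" happens to be.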
That's what pops into your head first?
It's one thing that strikes me as kinda odd.