All women pictured are A.I. generated
cyu@sh.itjust.works (banned from community) to Technology@lemmy.world – 183 points – 1 year ago
files.catbox.moe
That's a lot of white chicks.
Really, it's like 6, copy-pasted over and over
Yeah, a lot of the faces look very, very similar!
So?
Points to a limited dataset consisting mostly of white people
Without knowing the prompt vs the model, it's impossible to say which side of things is responsible for the lack of variety.
Many modern models are actually very good at reversing sample biases in the training set.
Good point