AI models collapse when trained on recursively generated data
floof@lemmy.ca to Technology@lemmy.world – 247 points – 2 months ago – nature.com · 35 comments
Very unlikely to stem from model collapse. Why would they use a worse model? It's more likely they neutered it or gave it fewer resources.
It learns from your own code as you type so it can offer more relevant suggestions, unlike the web-based LLMs. So you can make it feed back on itself.
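That feedback loop is essentially what the Nature paper calls recursive training. A toy sketch of the idea (purely illustrative, not the paper's actual setup): repeatedly "train" a model on data generated by the previous generation of itself, here by fitting a Gaussian to samples drawn from the previous fit. The fitted spread tends to drift away from the true distribution over generations, which is the diversity loss behind model collapse.

```python
import random
import statistics

# Toy model-collapse demo: each generation's "model" is a Gaussian
# fitted only to samples produced by the previous generation's model.
random.seed(0)

mu, sigma = 0.0, 1.0  # generation-0 model: the "real" data distribution
for gen in range(20):
    # Generation N's training data = generation N-1's output
    samples = [random.gauss(mu, sigma) for _ in range(50)]
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)  # MLE; systematically underestimates spread

print(f"after 20 generations: mu = {mu:.3f}, sigma = {sigma:.3f}")
```

With a finite sample each round, the fit keeps some of each generation's sampling noise and the estimated variance is biased low, so across many generations the model typically narrows and wanders rather than staying faithful to the original distribution.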
Where did you learn to write such shitty code?
I learned it from watching you!