AI models collapse when trained on recursively generated data

floofloof@lemmy.ca to Technology@lemmy.world – 247 points
AI models collapse when trained on recursively generated data - Nature (nature.com)


Yep. It creates a positive feedback loop: the models just keep reinforcing whatever came out before.
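
Here's that loop as a toy simulation (made-up numbers and a plain NumPy categorical "model", nothing from the paper's actual setup): refit the distribution on its own samples each generation and watch the rare tokens die off.

```python
# Toy feedback loop (made-up numbers, just NumPy; not the paper's setup):
# fit a categorical "model" to data, sample from it, refit on the samples,
# repeat. Once a token draws zero samples its probability is zero forever,
# so the tails die off and the model reinforces its own output.
import numpy as np

rng = np.random.default_rng(0)
vocab = 50
probs = np.full(vocab, 1 / vocab)  # generation 0: uniform over 50 tokens

for gen in range(1, 21):
    samples = rng.choice(vocab, size=100, p=probs)  # "model output"
    counts = np.bincount(samples, minlength=vocab)
    probs = counts / counts.sum()                   # "retrain" on own output
    print(f"gen {gen:2d}: {np.count_nonzero(probs)} of {vocab} tokens survive")
```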

And with increasing amounts of the internet being polluted with AI text output....

... AI inbreeding.

We call it the GRRM model.

In the USA, they call it the AlaLlama model.

What about the Grrr! model, after that astoundingly "XD So Random!" thing from Invader Zim?

He's an android or robot, right?

To be fair, this doesn't sound much different from your average human using the internet.

2024, Reverse Turing Test Challenge:

Can an LLM AI differentiate between human input and LLM AI input?

Well... It's built on statistics, and statistical inference eventually regresses to the mean. If all it ever gets to train on is closer and closer to the mean, there will be nothing left to work with. It will all be the average...
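
For the curious, here's a toy sketch of that regression to the mean (assumed numbers, plain NumPy, not the paper's method): fit a normal distribution to samples drawn from the previous fit, over and over.

```python
# Toy "return to the mean" (assumed numbers, not from the paper): fit a
# normal distribution to samples drawn from the previous fit. The sample
# std is a biased, noisy estimate, so over generations the fitted sigma
# drifts toward zero and the model collapses onto the average.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.0, 1.0  # generation 0: a standard normal "model"

for gen in range(1, 201):
    data = rng.normal(mu, sigma, size=20)  # small batch of "model output"
    mu, sigma = data.mean(), data.std()    # refit on own output
    if gen % 50 == 0:
        print(f"gen {gen:3d}: mu={mu:+.4f}  sigma={sigma:.6f}")
```

The fitted sigma shrinks by orders of magnitude while mu just wanders, which is exactly the "nothing left but the average" failure mode.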