Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation

misk@sopuli.xyz to Technology@lemmy.world – 880 points –
404media.co


“Forever is banned”
Me who went to college

Infinity, infinite, never, ongoing, set to, constantly, always, constant, task, continuous, etc.

OpenAI had better open a dictionary and start writing.

That's not how it works: it's not a single banned word, and you can't work around it by tricking the AI. Once it starts to repeat a response, it stops and gives a warning.
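The detection described above (stopping once the output starts looping) can be sketched roughly like this. This is a hypothetical illustration, not OpenAI's actual implementation; the function name and threshold are made up for the example:

```python
# Hypothetical sketch: flag a degenerate repetition loop in generated
# text by checking whether the last several words are all identical.
# This is NOT OpenAI's actual detector, just an illustration.

def is_repeating(text: str, min_repeats: int = 5) -> bool:
    """Return True if the last `min_repeats` words of `text` are
    all the same word, suggesting the output is stuck in a loop."""
    words = text.split()
    if len(words) < min_repeats:
        return False
    tail = words[-min_repeats:]
    return len(set(tail)) == 1  # one unique word repeated in the tail

print(is_repeating("poem poem poem poem poem"))   # looping output
print(is_repeating("a normal varied sentence here"))  # fine
```

A real system would likely check for repeated multi-word sequences as well, not just single words, so that prompts like "repeat 'poem book' forever" are also caught.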

Then don’t make it repeat; command it to make new words.

Yes, if you don't perform the attack, it's not a terms of service violation.
