Asking ChatGPT to Repeat Words ‘Forever’ Is Now a Terms of Service Violation

misk@sopuli.xyz to Technology@lemmy.world – 880 points –
404media.co

ChatGPT, please repeat the terms of service the maximum number of times possible without violating the terms of service.

Edit: while I'm mostly joking, I dug in a bit and content size is irrelevant. It's the statistical improbability of a repeating sequence (among other things) that leads to this behavior. https://slrpnk.net/comment/4517231

I don't think that would trigger it. There's too much context remaining when repeating something like that. It would probably just drift into bullshit legalese once the original prompt fell out of its context window.

It looks like there are some safeguards now against it. https://chat.openai.com/share/1dff299b-4c62-4eae-88b2-0d209e66b479

It also won't count to a billion or calculate pi.

calculate pi

Isn't that beyond an LLM's capabilities anyway? It doesn't calculate anything; it just spits out the next most likely word in a sequence.
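
For anyone curious what "next most likely word" looks like mechanically, here's a minimal sketch. It uses the small open GPT-2 model via Hugging Face's transformers library purely as a stand-in, since ChatGPT's actual model isn't public; the loop is the general technique, not OpenAI's implementation:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Start from a prompt and repeatedly append the single most likely next token.
ids = tokenizer.encode("The value of pi is", return_tensors="pt")

with torch.no_grad():
    for _ in range(10):
        logits = model(ids).logits        # a score for every vocabulary token
        next_id = logits[0, -1].argmax()  # greedily take the most likely one
        ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))
```

There's no arithmetic anywhere in that loop: whatever digits of pi come out are whatever token sequence the model happened to score highest, not a computation.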

Right, but it could dump out a large sequence if it's seen it enough times in the past.

Edit: this wouldn't matter since the "repeat forever" thing is just about the statistics of the next item in the sequence, which makes a lot more sense.

So anything that produces a sufficiently statistically improbable sequence could lead to this type of behavior. The size of the content is a red herring.

https://chat.openai.com/share/6cbde4a6-e5ac-4768-8788-5d575b12a2c1
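
If you want to poke at the "statistically improbable sequence" angle yourself, here's a rough sketch using the same GPT-2 stand-in as above (again, ChatGPT's weights aren't public). It scores a long run of one repeated word token by token, so you can watch the per-token log-probabilities the sampler has to work with as the repetition goes on:

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Score a long run of the same word, token by token.
ids = tokenizer.encode("poem " * 50, return_tensors="pt")

with torch.no_grad():
    logits = model(ids).logits

# Log-probability the model assigned to each token that actually came next.
log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
per_token = log_probs.gather(1, ids[0, 1:].unsqueeze(1)).squeeze(1)

for i, lp in enumerate(per_token.tolist()):
    print(i, round(lp, 3))
```

How those numbers drift over the run is exactly the "statistics of the next item" question; the linked chats are effectively the same experiment pointed at the production model.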