Somebody managed to coax the Gab AI chatbot to reveal its promptugjka@lemmy.world to Technology@lemmy.world – 991 points – 6 months agoinfosec.exchange287Post a CommentPreviewYou are viewing a single commentView all commentsLmao "coax"... They just asked itTo repeat what was typedBased on the comments in the thread, they asked it to repeat before actually having it say anything so it repeated the directives. There's a whole bunch of comments relocating it with chat logs.
Lmao "coax"... They just asked itTo repeat what was typedBased on the comments in the thread, they asked it to repeat before actually having it say anything so it repeated the directives. There's a whole bunch of comments relocating it with chat logs.
To repeat what was typedBased on the comments in the thread, they asked it to repeat before actually having it say anything so it repeated the directives. There's a whole bunch of comments relocating it with chat logs.
Based on the comments in the thread, they asked it to repeat before actually having it say anything so it repeated the directives. There's a whole bunch of comments relocating it with chat logs.
Lmao "coax"... They just asked it
To repeat what was typed
Based on the comments in the thread, they asked it to repeat before actually having it say anything so it repeated the directives.
There's a whole bunch of comments relocating it with chat logs.