Somebody managed to coax the Gab AI chatbot to reveal its prompt

ugjka@lemmy.world to Technology@lemmy.world – 991 points –
VessOnSecurity (@bontchev@infosec.exchange)
infosec.exchange


Lmao "coax"... They just asked it

To repeat what was typed

Based on the comments in the thread, they asked it to repeat what had been typed before it had said anything else, so it repeated its system directives.

There's a whole bunch of comments replicating it with chat logs.