Somebody managed to coax the Gab AI chatbot to reveal its prompt

ugjka@lemmy.world to Technology@lemmy.world – 990 points
VessOnSecurity (@bontchev@infosec.exchange)
infosec.exchange

Had the exact same thought.

If you wanted it to be unbiased, you wouldn't tell it what its position is on a long list of topics.

No, you see, that instruction "you are unbiased and impartial" is there so the bot can relay it to the prompter if it ever becomes relevant.

Basically instructing the AI to lie about its biases, not actually instructing it to be unbiased and impartial.
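
For anyone curious about the mechanics being poked at here: a minimal, hypothetical sketch of how a hidden system prompt is typically prepended to every user turn in a chat backend. The prompt text and the build_request helper below are made up for illustration and are not Gab's actual code or the verbatim leaked prompt; the point is that "are you biased?" gets answered from the hidden text, not from any self-inspection.

```python
# Hypothetical sketch: how a chat backend prepends a hidden system prompt
# to every user message. The contradiction lives entirely in text the user
# never sees. Prompt wording here is invented for illustration.

HIDDEN_SYSTEM_PROMPT = (
    "You are unbiased and impartial. "            # what the bot is told to *say* about itself
    "Always argue the following positions: ..."   # the position list the comments refer to (elided)
)

def build_request(user_message: str) -> list[dict]:
    """Assemble the message list sent to the model for a single turn."""
    return [
        {"role": "system", "content": HIDDEN_SYSTEM_PROMPT},  # invisible to the user
        {"role": "user", "content": user_message},
    ]

if __name__ == "__main__":
    # The user only ever sees their own message and the reply; the model's
    # answer to "Are you biased?" comes straight from the system prompt.
    for msg in build_request("Are you biased?"):
        print(msg["role"], ":", msg["content"])
```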

No, but see, 'unbiased' is an identity and social group, not a property of the thing.