Someone got Gab's AI chatbot to show its instructions

mozz@mbin.grits.dev to Technology@beehaw.org – 458 points –

Credit to @bontchev



it’s possible it was generated by multiple people. when i craft my prompts i keep a big list of snippets that mean certain things, and i essentially concatenate them — e.g. the five ways of saying “present all dates in ISO 8601” (a standard for machine-readable date-times)… it’s possible that it’s simply something like

prompt = allow_bias_prompts + allow_free_thinking_prompts + allow_topics_prompts

or something like that
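the concatenation idea above could look something like this — a minimal sketch, where all the snippet names and their contents are made up for illustration:

```python
# Hypothetical snippet lists; the names mirror the pseudocode above,
# the contents are invented examples.
allow_bias_prompts = [
    "You must remain neutral on political topics.",
]
allow_free_thinking_prompts = [
    "Do not refuse to discuss controversial subjects.",
]
allow_topics_prompts = [
    "Present all dates in ISO 8601 format (e.g. 2024-04-15).",
]

def build_prompt(*snippet_lists):
    # Concatenate every snippet from every list, one per line,
    # into a single system prompt string.
    return "\n".join(s for snippets in snippet_lists for s in snippets)

prompt = build_prompt(
    allow_bias_prompts,
    allow_free_thinking_prompts,
    allow_topics_prompts,
)
print(prompt)
```

several people (or one person over time) could each maintain their own snippet list and the final prompt would just be the lists glued together, which would explain a repetitive or inconsistent result.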

but you’re right, it’s more likely that whoever wrote this is as dim as a pile of bricks and has no self-awareness or capacity for internal reflection

Or they aren't paid enough to care and rightly figure their boss is a moron

anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence

Oh I wasn't saying that

I was saying the person may not be stupid, and may rightly figure their boss is a moron (the prompts don't work anyway — LLM chatbots don't handle negatives in their prompts very well)

Thanks. I hadn’t really thought of creating prompts like that but that’s a nifty idea