Jailbroken AI Chatbots Can Jailbreak Other Chatbots

misk@sopuli.xyz to Technology@lemmy.world – 462 points –
scientificamerican.com


Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, which then writes an obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content and allowing them to "talk" and evolve unchecked... very slowly... in the background

It might be faster if it can drop a shell in the data center and run its own commands....

Bro, turn this into a short story!!!!

Dumb AI that you can't appeal will cause problems long before AGI

Already can't reach the owner of any of these big companies

Reviewing the employee is doing the manager's job