The White House wants to 'cryptographically verify' videos of Joe Biden so viewers don't mistake them for AI deepfakes

L4sBot@lemmy.worldmod to Technology@lemmy.world – 636 points –
businessinsider.com

Biden's AI advisor Ben Buchanan said a method of clearly verifying White House releases is "in the works."



I'm sure they do. AI regulation probably would have helped with that. I feel like Congress was busy with shit that doesn't affect anything.

I salute whoever has the challenge of explaining basic cryptography principles to Congress.

Might just as well show a dog a card trick.

That's why I feel like this idea is useless, even for the general population. Even with some sort of visual/audio-based perceptual hashing, so that the hash is independent of minor changes like video resolution which don't change the content, and with major video sites implementing a way for the site to verify that the hash matches one from a trustworthy keyserver equivalent...

The end result for anyone not downloading the videos and verifying them themselves is the equivalent of those old "✅ safe ecommerce site, we swear" images. Any dedicated misinformation campaign will just fake it, and that will be enough for the people who would have believed the fake to begin with.
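The "download it and verify it yourself" step being described could look something like this sketch in Python. The published hash here is hypothetical (a real scheme would fetch it from the trusted keyserver equivalent, and would sign it rather than just publish it):

```python
import hashlib

# Hypothetical: a hash the publisher posted on a trusted keyserver equivalent.
PUBLISHED_SHA256 = hashlib.sha256(b"original video bytes").hexdigest()

def verify_video(video_bytes: bytes, published_hash: str) -> bool:
    """Recompute the SHA-256 of the downloaded file and compare it
    against the hash fetched from the trusted source."""
    local_hash = hashlib.sha256(video_bytes).hexdigest()
    return local_hash == published_hash

print(verify_video(b"original video bytes", PUBLISHED_SHA256))   # True
print(verify_video(b"re-encoded video bytes", PUBLISHED_SHA256))  # False
```

Note that this also illustrates the problem raised above: a plain byte-level hash breaks as soon as a site re-encodes or rescales the video, which is exactly why the comment reaches for perceptual hashing instead.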

Should probably start out with the colour-mixing one. That was very helpful for me in figuring out public-key cryptography. The difficulty comes in when they feel like you're treating them like toddlers, so they start behaving more like toddlers. (Which they are 99% of the time.)
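For anyone who hasn't seen it: the colour-mixing analogy maps onto Diffie-Hellman key exchange. Both sides share a public "common paint", each mixes in a private colour, they swap the mixtures, and each mixes their private colour in again, so both end up with the same final colour without it ever crossing the wire. A toy sketch with deliberately tiny, insecure numbers:

```python
import random

# Public "common paint": a small prime modulus and generator.
# Real deployments use vastly larger parameters; these are toy values.
P, G = 23, 5

def mix(base: int, secret: int) -> int:
    """'Mixing paint' = modular exponentiation: easy to do, hard to undo."""
    return pow(base, secret, P)

alice_secret = random.randrange(2, P - 1)  # Alice's private colour
bob_secret = random.randrange(2, P - 1)    # Bob's private colour

alice_public = mix(G, alice_secret)  # mixture Alice sends in the open
bob_public = mix(G, bob_secret)      # mixture Bob sends in the open

# Each side mixes the other's public mixture with their own secret...
alice_key = mix(bob_public, alice_secret)
bob_key = mix(alice_public, bob_secret)

# ...and both arrive at the same shared colour, which an eavesdropper
# who only saw the two public mixtures cannot easily reproduce.
print(alice_key == bob_key)  # True
```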

I see no difference between creating a fake video/image with AI and with Adobe's packages. So to me this isn't an AI problem; it's a problem that should have been resolved a couple of decades ago.