California bill would require ‘watermarks’ to signal content created by AI

chagall@lemmy.world to Technology@lemmy.world – 266 points –
sfchronicle.com

Alternative link: https://archive.is/qgEzK



Only gonna make things more difficult for good actors while doing absolutely nothing to bad actors

That's true, but it would be nice to have a codified way of applying a watermark denoting AI. I'm not saying the government of CA is the best consortium, but laws are one way to get a standard.

If a compliant watermarker is then baked into the programs designed for good actors, that's a start.

It would be as practical for good actors to simply state an image is generated in its caption, citation, or some other preexisting method. Good actors will retransmit this information, while bad actors will omit it, just like they’d remove the watermark. At least this way, no special software is required for the average person to check if an image is generated.

Bing Image Creator already implements watermarks but it is trivially easy for me to download an image I generated, remove the watermark, and proceed with my ruining of democracy /s

I wasn't thinking of a watermark like someone's visible signature. More of a crypto signature most users couldn't detect; not a watermark that could be removed with visual effects. Something most people don't know is there, like a printer's hidden signature used for anti-counterfeiting.

I don't want to use the word blockchain, but some kind of way that if you want to take a fake video created by someone else, you are going to have a serious math problem on your hands to take away the fingerprints of AI. That way any viral video of unknown origin could easily be determined to be AI without any "look at the hands" arguments.

I'm just saying, a solution only for good guys isn't always worthless. I don't actually think what I'm saying is too feasible. (Especially as written.) Rules that only bind good guys aren't always about taking away freedom; sometimes they're about normalizing discourse. Although, my argument is not particularly good here, as this is a CA law, not a standard. I would like the issue at least discussed at a joint AI consortium.
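To make the idea above concrete, here's a minimal Python sketch of a provenance tag that a generator could ship alongside an image. This is purely illustrative: the key name and demo bytes are made up, and real content-credential schemes use asymmetric signatures embedded in metadata, not a shared-key HMAC like this.

```python
import hmac
import hashlib

# Hypothetical shared key held by the AI vendor (illustrative only;
# a real scheme would use a public/private key pair).
GENERATOR_KEY = b"demo-key-held-by-the-ai-vendor"

def sign_image(image_bytes: bytes) -> bytes:
    """Produce a provenance tag to ship alongside the generated image."""
    return hmac.new(GENERATOR_KEY, image_bytes, hashlib.sha256).digest()

def verify_image(image_bytes: bytes, tag: bytes) -> bool:
    """Check the tag; editing even one byte of the image breaks it."""
    expected = hmac.new(GENERATOR_KEY, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Demo with fake "image" bytes.
fake = b"\x89PNG...generated pixels..."
tag = sign_image(fake)
print(verify_image(fake, tag))            # intact: verifies
print(verify_image(fake + b"\x00", tag))  # any edit: fails
```

Note this cuts both ways, as the thread points out: the tag is tamper-evident but not removal-proof. A bad actor can simply discard it, the same way they'd strip a caption or a visible watermark.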

If your plan requires good actors to put in extra effort, it's a bad plan

How in the world would this make anything more difficult for good actors?