Due to deepfakes and people getting really good at editing, there's only going to be a very brief period in human history when video could be trusted as evidence.

okbuddyretard@kbin.social to Shower Thoughts@kbin.social – 0 points –

title


And it's basically ended already. At least for ordinary people without an IT forensics team, the best advice is to be very sceptical of images and videos, more or less so depending on the source.

Grain of salt, rather than skepticism.

No normal person can waste their time being truly skeptical of everything they see. The reality is we don't have to (nor do we often) believe everything we see in normal situations. So take everything you see online, on TV, from afar with a grain of salt. Don't put too much faith in it. Shit, this is important even with actual, real video footage: context can change everything. You get twenty seconds of a twenty-minute video and you can make it say anything you want. Just cut the context to suit.

In situations where evidence counts, such as a courtroom, the chain of custody of that evidence is considered. Any old mp4 can't just be submitted to the court as evidence without the other side getting it thrown out. So there really, truly is hardly any concern whatsoever about generative AI in the courts.

And the other side of that coin is confirmation bias. I don't care how shitty the fake is, if you show a MAGA a video of Biden eating a baby, that person will insist it's real. Against any evidence to the contrary, they'll argue it's real and reality won't matter.

That's what's meant by the "post-truth world". It isn't a problem of establishing what is and isn't true. The problem is that the truth doesn't matter anymore.