Deepfake voices improving kidnapping scams
cnn.com
"A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough." Mom wary about answering calls for fear voice will be cloned for future virtual kidnapping.
Unless I know who you are, I'm not answering your call. 90% of the time it's a robot and the other 10% can leave a voicemail.
Isn't a voicemail worse for detecting deepfakes, since it doesn't require the caller to listen and respond dynamically?
I'm not personally concerned about getting duped by a deepfake. I just don't want to talk to any robots, solicitors, or scammers.
I've had calls from numbers similar to my own, and seen caller IDs for people who aren't in my contacts. I haven't picked them up, but the temptation was there.
I'll go one further - unless it's my doc, my wife, or my boss, I'm neither answering the call nor listening to the voicemail. That's what easily skimmable voicemail transcription is for...
I don't love the privacy implications of transcribed voicemail, ofc, but it's better for my own privacy/threat model than answering the phone to robots, scammers, etc. It's also a hell of a lot better for my mental health than listening to them.
You might know the number. My wife used to live in Kenya and renamed her "Mum"/"Dad" contacts after they once got a call from her stolen phone saying she'd been arrested and they needed to send money for bail.
Same. Life is much better this way
Real kidnappers will not be happy about this as deepfakes become more prevalent and ransom calls get ignored more and more. Do they have a union that can go on strike to raise awareness about this?
The real victims here
"Improving" is not a word I would use in this case.
Like that old Invader Zim line.
“You made the fires worse!”
“Worse? Or better?”
Right? Reads like… hey kidnappers, parents hate this one simple trick
This is why code words are important.
As awful as it sounds, this needs to be set up between family members. Agree on a phrase or code word to check that they are who they say they are. This is already common practice with alarm system monitoring companies; they have to make sure the intruder isn't the one answering the phone and claiming it's a false alarm.
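Purely as an illustrative sketch of the idea (the code word, normalization, and constant-time comparison here are my own assumptions, not anything from the article; the phrase itself is a placeholder you'd agree on in person):

```python
import hmac

# Hypothetical example: the family agrees on a secret phrase in person,
# never over the phone or by text. During a suspicious call you ask for it
# and compare the answer against what was agreed.
FAMILY_CODE_WORD = "purple giraffe"  # placeholder; pick your own

def caller_is_verified(spoken_answer: str) -> bool:
    """Return True only if the caller gives the agreed code word."""
    # Normalize casing/whitespace so a nervous relative still passes.
    expected = FAMILY_CODE_WORD.strip().lower().encode()
    given = spoken_answer.strip().lower().encode()
    # hmac.compare_digest avoids leaking information via timing;
    # overkill for a phone call, but a harmless habit.
    return hmac.compare_digest(expected, given)

if __name__ == "__main__":
    print(caller_is_verified("Purple Giraffe"))   # True
    print(caller_is_verified("send bail money"))  # False
```

The point isn't the code, it's the protocol: the secret is shared out of band, and anything the caller says that isn't the agreed phrase fails the check, no matter how convincing the voice sounds.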
Techbros be like "but what if it can be used to resurrect dead actors?".
Not even necessarily dead actors. They used AI to bring young Luke Skywalker back in The Book of Boba Fett. And it was not great, but it was serviceable. Now give it 10 years.
Ohh, well that totally makes all this worthwhile then.
Sorry, I wasn't trying to suggest that at all. I think it's appalling.
The 'hostage-taker' will never be able to duplicate my family's grammar and sentence structure quirks, so I won't care how it "sounds"...
I'm pro-AI but any technology that can lead to the creation of deepfakes must be explicitly banned.
Naturally, we're already talking about criminals, but you combat this issue the same way you combat school shootings: by banning the root of the issue and actively prosecuting anyone who dares acquire it illegally.
EDIT: The victims wouldn't have fallen to this deepfake scam if they had their own deepfake scam. Scam the criminal before he scams you!!!!!
Is this what gaslighting is?
Perhaps there should be government-controlled licenses for some technologies, like for gun ownership? Although there are probably all sorts of ways that could be circumvented. Not sure how best to control this, though.
*edit wow thanks, nice to know what sort of community this is. Nothing in the responses so far has told me anything I didn't already know, and I did point out that it would be circumvented. I don't see any other ideas, though. Maybe with that sort of negative, defeatist response we'll never even try. Fuck it right, let's just watch the world burn. /s
Ah yes because making something illegal stops criminals from using it. Problem solved.
Basically impossible.
It's against the ToS to use tools like teamviewer to run user support scams, for example, but people do it anyway. You can't legislate against criminals, because if they're already breaking the law, why would they care about another law?
The only way forward here is enforcement. There needs to be better coordination between governments to track down and prosecute those running the scams. There's been a lot of pressure on India, for example, to clean up their act with their very lax cybercrime enforcement, but it's very much an uphill battle.
Not really comparable to guns; making it harder to get a physical object is much different from preventing people from downloading software. Even 3D-printed guns require equipment and knowledge to make use of the download.