Deepfake voices improving kidnapping scams

cyu@sh.itjust.works to Technology@lemmy.world – 227 points
'Mom, these bad men have me': She believes scammers cloned her daughter's voice in a fake kidnapping | CNN
cnn.com

"A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough." The mom is now wary of answering calls for fear her voice will be cloned for a future virtual kidnapping.



Unless I know who you are, I'm not answering your call. 90% of the time it's a robot and the other 10% can leave a voicemail.

Isn't a voicemail worse for detecting deepfakes, since it doesn't require the scammer to listen and respond dynamically?

I'm not personally concerned about getting duped by a deep fake. I just don't want to talk to any robots, solicitors, or scammers.

I have had calls from numbers similar to my own, and seen caller IDs for people who aren't in my contacts. I haven't picked them up, but the temptation was there.

You might know the number. My wife used to live in Kenya and renamed her "Mum"/"Dad" contacts after they once got a call from her stolen phone saying she'd been arrested and they needed to send money for bail.

I'll go one farther - unless it's my doc, my wife, or my boss, I'm neither answering the call nor listening to the voicemail. That's what easily skimmable voicemail transcription is for...

I don't love the privacy implications of transcribed voicemail, ofc, but it's better for my own privacy/threat model than answering the phone to robots and scammers. It's also a hell of a lot better for my mental health than listening to them.