You answer the phone to a family member breathlessly explaining a horrific car accident: they need money now, or they'll go to jail. The desperation in their voice is palpable as they demand an immediate cash transfer. It certainly sounds like them, and the call is coming from their number, but something feels off. So you hang up and call back right away. When your family member answers, they say there was no accident and they have no idea what you're talking about.
Congratulations: you've just avoided an AI scam call.
As generative AI tools improve, it's becoming easier and cheaper for fraudsters to create fake but convincing clones of people's voices. These AI voice clones are trained on existing audio clips of human speech and can be tuned to imitate just about anyone. The latest models can also speak in many languages. OpenAI, the maker of ChatGPT, recently announced a new text-to-speech model that further improves voice cloning and makes it more widely accessible.
Of course, bad actors are using these AI cloning tools to fool victims into thinking they're talking to a loved one on the phone, when they're actually talking to a computer. The threat of AI-powered fraud is scary, but keeping these expert tips in mind can help you stay safe the next time you receive an urgent and unexpected call.
Remember that AI voices are difficult to detect
It's not just OpenAI. Many technology startups are working on near-perfect replicas of the human voice, and recent progress has been rapid. Ben Colman, co-founder and CEO of Reality Defender, says that just a few months ago his team would have offered hints about what to listen for, such as pregnant pauses or signs of latency. Like many aspects of generative AI over the past year, AI audio has become a far more convincing imitation of the real thing. Safety strategies that rely on detecting odd quirks over the phone are now outdated.
Hang up and call back
Security experts warn that it's very easy for scammers to make a call appear to come from a legitimate phone number. "Scammers often spoof the phone number to make it look like they are calling from a government agency or bank," says Michael Jabara, Visa's global head of fraud services. "You have to be proactive." When you receive a call demanding money or personal information, whether it seems to come from your bank or from a loved one, hang up and call back. Find the number online or in your contacts and start a follow-up conversation. You can also try reaching out through another verified channel, such as video chat or email.
Create a secret safe word
A common security tip suggested by multiple sources is to create a safe word that only family members know and can be asked for over the phone. "You can also pre-negotiate words and phrases that your loved one can use to prove they're a real person if they find themselves in a duress situation," says Steve Grobman, McAfee's chief technology officer. Calling back or confirming through another channel is best, but a safe word can be especially helpful for younger or older relatives who may be difficult to reach any other way.
Or ask them what they had for dinner
What should you do if you haven't set up a safe word and are trying to determine whether an alarming call is real? Pause for a moment and ask a personal question. "It can be as simple as asking a question that only your loved one knows the answer to," Grobman says. "It might be, 'Hey, I want to confirm this is really you. What did we have for dinner last night?'" Make sure the question is specific enough that a stranger couldn't guess the answer.
Understand that any voice can be imitated
Deepfake voice clones aren't limited to celebrities and politicians, like the New Hampshire robocalls that used AI tools to sound like Joe Biden and discourage people from voting. "One misconception is, 'This won't happen to me. No one can replicate my voice,'" says Rahul Sood, chief product officer at Pindrop. "What people don't realize is that just 5 to 10 seconds of audio, from a TikTok you've posted or a YouTube video for work, can easily be used to clone your voice." Using AI tools, even your smartphone's outgoing voicemail message may be enough to replicate your voice.
Don't give in to emotional appeals
Whether it's a pig butchering scam or an AI phone call, experienced scammers build trust, create a sense of urgency, and probe for weaknesses. "Be careful with anything that gets your emotions running high," Jabara says. "The best scammers aren't necessarily the most skilled technical hackers, but they have a really good understanding of human behavior." If you take a moment to reflect on the situation and resist acting impulsively, chances are you won't be scammed in that moment.