Who is really calling? The rise of AI voice cloning scams.

Amanda Lee
Senior Program Manager, Tech for Good & TELUS Wise®

When was the last time you posted a video on social media? Innocent enough, right? You just wanted family and friends to see the great hike you did, the amazing meal you ate or the birthday candles your son blew out. But with the rise of AI voice cloning and associated scams, those innocent videos provide just what bad actors need to fool the people you care about most.
What is AI voice cloning?
AI voice cloning, also known as synthetic or deepfake audio, uses AI to convincingly recreate a person’s voice. A voice sample is typically sourced from YouTube videos or social media posts. Bad actors have embraced AI voice cloning to take their scams to a more personal level.
How prevalent are AI voice scams? In its Q4 2024 Global Call Threat Report, Hiya, a global leader in AI-powered voice intelligence, found that one third of survey respondents across the U.S., U.K., Canada, Germany, France and Spain encountered deepfake voice fraud in 2024, and that 30% fell victim to it.
Hiya also found that AI-generated deepfake fraud calls resulted in individual losses exceeding $6,000. In Canada, 27% of consumers reported experiencing deepfake fraud calls, with average reported losses of $1,479.
Real-life stories
When AI voice cloning first hit the scene, the “grandparent” scam was the most common. Targeting seniors, scammers would clone the voices of grandchildren and place distress calls to their grandparents.
Marilyn Crawford received one such call. Early one morning, someone pretending to be a local police officer called to inform her that her grandson was in custody for stealing a car. At one point in the call, her “grandson” (or at least a convincing likeness of his voice) came on the line pleading for help. Releasing him would only cost $9,000. The scammer even sent a taxi to take her to the bank! Thankfully, the bank teller became suspicious upon hearing the story and saved Marilyn from the scam.
With the mainstream adoption of AI, scammers have become even craftier and more ambitious. In June 2025, the Canadian Anti-Fraud Centre (CAFC), in partnership with the Canadian Centre for Cyber Security (CCCS), warned about a campaign impersonating high-profile public figures.
Scammers were leaving AI-generated voice messages mimicking the voices of senior officials and prominent public figures. The messages made urgent requests for money or sensitive information, and the targets were business executives and senior public officials.
How to spot and stop the scam
AI voice cloning is convincing. It’s meant to be. But there are some telltale signs that the person on the other end of the line isn’t really a friend or loved one.
- The caller is informing you about an emergency situation that seems out of character for that person (e.g., arrested for stealing a car, detained for texting while driving and causing an accident, unexpected health emergency).
- The request (typically for money) is urgent and sometimes threatening – you must act right away or else something worse may happen.
- There are inconsistencies in the person’s voice or tone that seem odd, or you hear unexpected background noises.
- The caller asks you to send money anonymously via e-transfer or cryptocurrency.
What steps can you take to protect yourself and your family? There are a few strategies you can adopt to help avoid falling victim to AI voice cloning scams.
- Take a breath: these calls are meant to elicit an intense emotional reaction (and rightly so, if they were real). If you do get an emergency call from a loved one, pause for a moment to gather your thoughts so you can respond instead of react.
- Ask questions: verify that the person calling is who they say they are. Ask them something personal only they would know (a family member’s phone number, a friend’s birthday, the last trip you took as a family, their favourite bedtime book when they were young).
- Reach out to the person: call or text your loved one to make sure they are ok. If they answer and are fine, you’ll know the call was a scam.
- Have open conversations: talk with your family about how scammers are using AI voice cloning and provide real-life examples they can relate to.
- Establish a family plan: choose a unique word or phrase that everyone can remember and that only your family would know. If anyone gets a strange, urgent emergency call, this becomes an easy way to verify whether it’s real or fake.
- Limit social media sharing: scammers pull voice clips from what they can find publicly. Limit how much video or audio you share on social media and lock it down to close friends and family (for example, use Instagram’s Close Friends option when sharing stories).
AI voice scams are becoming more frequent and more convincing. Knowing how to spot them and following some simple safety strategies can make these attempts more of a nuisance than a hazard. As AI becomes increasingly mainstream, looking out for our loved ones, staying informed and trusting our guts will keep us safe and help us enjoy the benefits of this new technological era.