Is That Really Your Grandchild? The Dangers of AI Impersonation Scams
Published: January 16, 2025

Imagine getting a call from someone who sounds exactly like your grandchild. They’re crying, begging for help, and urgently need money for an unexpected crisis. It feels real—but it’s not. Advances in artificial intelligence (AI) are enabling criminals to clone voices with startling accuracy, turning innocent grandparents into victims of a highly convincing scam.
All it takes is a few seconds of audio—often captured from social media videos, voicemail greetings, or streamed events—for criminals to replicate a voice. With AI, scammers can mimic the tone, emotion, and speech patterns of your loved ones, making it nearly impossible to distinguish the real from the fake.
What Is the Grandparent Scam?
The grandparent scam has existed for decades, but AI voice cloning is taking it to a new level. Here’s how it works:
- Scammers use AI to clone the voice of a grandchild or relative.
- They create a fabricated crisis, such as being in jail, injured, or stranded, and claim they need money immediately.
- The voice sounds so real—often accompanied by tears or panic—that it bypasses logical thinking and triggers an emotional response.
The FBI has reported a sharp rise in these scams, with reported losses in 2024 exceeding those of any previous year. Seniors are often the primary targets because scammers exploit their desire to help family members in distress.
Why Are These Scams So Effective?
AI voice cloning makes these scams feel personal. Here’s why they work:
- Emotional Manipulation: The fake urgency and emotional distress leave little time for critical thinking.
- Credible Details: Scammers use public information from social media to make their stories believable, such as mentioning specific family members or recent events.
- Pressure Tactics: They often demand secrecy, claiming “Don’t tell Mom and Dad,” to prevent the victim from verifying the story.
These tactics exploit trust and fear, leaving victims feeling panicked and obligated to act quickly.
How to Protect Yourself From AI Voice Cloning Scams
Here are practical steps you can take to protect yourself and your loved ones from falling victim to these scams:
1. Set Up a Family Code Word
Create a “safe word” with your family that only you and your loved ones know. If you receive a distressing call, ask for the code word. If the caller can’t provide it, it’s likely a scam.
2. Limit Personal Information Online
Be cautious about what you share on social media. For example:
- Avoid posting videos that include your voice, and be cautious about tagging family members.
- Review your privacy settings to limit who can see your content.
3. Be Skeptical of Urgent Requests
If you receive a call claiming to be from a family member in trouble:
- Stay calm. Panic can cloud judgment.
- Hang up and contact the person directly using a trusted number.
- Verify their story with other family members.
4. Switch to Automated Voicemail
Replace personal voicemail greetings with a generic, pre-recorded message from your phone provider. This removes an easy source of voice samples that scammers could use to clone your voice.
5. Watch for Red Flags
Even with AI, cloned voices may have slight inconsistencies. Look out for:
- Lack of natural pauses or emotional nuance.
- Strange background noises or robotic undertones.
- Requests for payment through non-traditional methods, such as gift cards or cryptocurrency.
Looking Ahead: Protect Your Family’s Financial Security
Protecting yourself from scams is an important step, but planning for the future is equally vital. With the largest wealth transfer in history underway, understanding how to build, protect, and distribute wealth is essential.
Join us for our upcoming webinar:
The Great Wealth Transfer: How to Pass and Receive Wealth
Wednesday, January 22 | 6 PM ET
Learn key strategies for building, protecting, and transferring wealth, organizing your financial plans, and making informed decisions when passing or receiving an inheritance.
Register Now
Article content is provided for information purposes only.