    Deepfake Defense Hub

    AI Is Being Used Against You.
    Here's How to Fight Back.

    Voice clones. Fake videos. "Your grandchild is in jail" calls that sound exactly like them. The scams are already here — and the defense is simpler than you think. Start with a family password, today.

    THE BASICS

    What's a deepfake?

    A deepfake is AI-created video, audio, or imagery that looks and sounds real. In 2026 it takes seconds and costs almost nothing to make one — which means they are being used in scams against ordinary people right now.

    Who can make one

    Anyone with a laptop. Voice clones need as little as 3 seconds of audio from a voicemail or social media video. Face swaps and full video fakes take a few minutes on free or $10/mo websites.

    Who they are used against

    Not celebrities — you. Grandparents, parents of college students, small business owners wiring money, HR staff processing "urgent" requests from "the CEO." The FBI logged over $500 million in AI-assisted impersonation scams last year.

    Why this page exists

    You cannot stop deepfakes from being made. But you can train yourself and your family to not fall for them. Most of the defense is one habit, repeated consistently.

    KNOW YOUR ENEMY

    The three biggest threats

    Most deepfake crime falls into three buckets. If you know these three, you know the pattern when it lands.

    Voice cloning scams

    "Grandma, I am in trouble, please send money." The voice sounds exactly like your grandchild — because a 3-second Instagram clip was enough to clone it. The caller is crying, panicking, telling you not to tell Mom. It is NOT your grandchild.

    Video deepfakes

    A celebrity endorsing a miracle pill or crypto scheme they have never heard of. A politician saying something inflammatory they never said. Footage of "your boss" in a Zoom call asking you to wire money urgently.

    Image deepfakes

    "Proof" photos used in romance scams (a handsome stranger's face on a generic soldier photo). Fake IDs for verification fraud. Fake "nudes" used to extort teenagers and adults. None of them are real.

    DO THIS TODAY

    Family password system (most important)

    Set up a family password right now. A word or phrase only your family knows. When anyone calls claiming to be in trouble, ask for the password. Any voice can be cloned in seconds — this cannot.

    Pick a word or phrase

    Something memorable but not guessable from social media. Not a pet name. Not a hometown. Something silly: "purple dog," "Tuesday pancakes," "Aunt Linda's piano." Silly is better — harder to guess, easier to remember.

    Share it with everyone who matters

    Kids, parents, spouse, grandparents, close friends: anyone who might call you in a panic, or whom you might get a panicked call about. Tell them in person or on a video call. Not in a text, not in an email.

    Keep it OFF the internet

    Never in a group chat. Never in an email. Never on social media. Never written down on a Post-it on your fridge where a house sitter could photograph it. It only works if scammers cannot find it.

    Use it EVERY time

    If someone calls in distress asking for money, ask for the password before you react. Real family will not be offended — they will be relieved you are careful. If the caller refuses, gets angry, or says "this is not the time" — hang up.

    Refresh it if you think it has leaked

    If someone outside the circle asks about it, or you discover it was in a hacked account, pick a new one. Takes 30 seconds.

    YOUR EYES

    How to spot video deepfakes

    Deepfake technology is improving fast, but most 2026 fakes still have visible giveaways if you slow down and look carefully.

    Blinking looks wrong

    Too much, too little, or oddly timed. Real eye movement has subtle drift and micro-blinks; AI often gets the rhythm wrong, leaving the eyes looking uncanny.

    Lips do not match audio perfectly

    Watch the mouth during "b," "p," and "f" sounds — AI often lets lips stay open or closed at the wrong moment. Slow the video down to half speed if you can.

    Skin texture is too smooth or too perfect

    Real skin has pores, small reflections, slight asymmetry. Deepfake faces often look like they went through a beauty filter — airbrushed, waxy, or weirdly glowing.

    Lighting inconsistencies

    The face is lit from one direction, the neck from another. Shadows fall differently on the face than on the background. The person's glasses do not reflect the room they claim to be in.

    Slow Ken Burns zooms and static shots

    Many low-effort deepfakes hide flaws with very slow zooms on a single photo or stiff, locked-camera shots. Real videos have natural handheld motion or varied angles.

    Audio quality is suspiciously "convenient"

    Background is weirdly silent, or very noisy in a uniform way. Echo does not match the room. No natural breathing or mouth sounds between words.

    YOUR EARS

    How to spot voice clones

    Voice clones are harder to spot than video fakes — the tech is genuinely good. Rely less on how the voice sounds and more on how the call behaves.

    Urgency about money

    Needs cash right now. Cannot wait for the bank to open. Cannot call back in five minutes. Real family emergencies almost never require a wire transfer in the next 10 minutes.

    Will not answer personal questions

    Ask something only they would know — not something on Facebook. "What did Aunt Linda make for Thanksgiving last year?" If there is stammering, deflection, or "I cannot hear you right now," hang up.

    Poor audio quality (often intentional)

    A muffled or crackly connection hides AI artifacts. "I am in a noisy place" or "this is a bad cell signal" is a red flag when combined with a money request.

    Refuses to switch to video

    Any excuse not to go to FaceTime, Google Meet, or Zoom. "My phone is broken." "I am at the police station." "They took my phone." Asking for video is the single fastest way to unmask a scam.

    Asks for gift cards, wire transfers, or crypto

    Real emergencies are paid with credit cards, which can be disputed and reversed. Gift cards, wire transfers, and crypto are gone the moment they are sent. These three payment methods are the universal scam signature.

    CONFIRM

    Verification techniques

    When your instincts say "something is off," these are the five moves that end the scam in under 5 minutes.

    Hang up and call back on a KNOWN number

    Not the number they called from. Not a number they give you. The number you already have in your phone for that person or institution. If it was real, they will pick up or call you right back.

    Ask specific personal questions

    Things that are not on Facebook. A shared inside joke. The name of the family dog that died ten years ago. The color of Grandma's kitchen wall. Scammers and AI will stumble.

    Switch to video call

    "I want to see your face. Let me FaceTime you." A real loved one will do it. A scammer will invent a reason why they cannot.

    Ask for the family password

    The single fastest test. If the answer is hesitation, anger, or a different word — it is a scam.

    Wait 15 minutes and reach out another way

    Text the person. Call their spouse. Message a friend who would know. If the story was real, 15 minutes does not change it. If it was fake, the scammer will have moved on to the next target.

    TECH YOU CAN USE

    Real tools to help

    None of these are magic, but used together they catch a lot of fakes before they land.

    Free deepfake-detection tool. Paste a video URL or upload a clip; it scans for known AI-generation fingerprints. Not perfect but a useful first check.

    Intel FakeCatcher

    A detection technology from Intel, offered to platforms and media companies. It analyzes subtle blood-flow signals in face pixels (real skin flushes slightly with every heartbeat; AI faces do not) in real time.

    Reverse image search

    Paste any profile photo into TinEye or Google Lens. If it shows up as a stock photo, a model's portfolio, or someone else entirely — it is a fake identity.

    Google Lens on mobile

    Works straight from a phone photo. Upload a screenshot of the suspicious person — if Lens finds the same face under other names, you have a scammer.

    Photo-verification service used by insurers and journalists. Its Lens app captures "certified" photos that can be verified as unedited — useful if you need to prove something is real.

    FIGHT BACK

    Reporting deepfakes

    Reporting takes 5–10 minutes. It helps investigators, and a surprising number of these cases do lead to arrests when patterns are pieced together.

    ReportFraud.ftc.gov

    The Federal Trade Commission's front door for all US consumer fraud. Report deepfake scams, romance scams, and impersonation. Feeds directly to law enforcement.

    IC3.gov

    The FBI's Internet Crime Complaint Center. Use it for any scam involving over $1,000, business email compromise, or wire transfers. Your report goes into the federal database used across investigations.

    Your state Attorney General

    Every state AG office has a consumer fraud reporting form online. Good for scams that cross state lines or involve a specific local business.

    Platform reporting (Instagram, Facebook, TikTok, X)

    Every major platform has a deepfake / impersonation / manipulated-media report button. Use it. Large volumes of reports move the needle on takedowns faster than a single one.

    If you sent money

    Call the sending bank or card issuer within 24 hours — reversals are sometimes possible, especially on credit cards. Call local police for a report number (required for some insurance and tax claims). Change any compromised passwords immediately.

    SHRINK YOUR TARGET

    Protect your own voice and image

    The less public audio and video of you that exists, the harder you are to clone. You do not have to go dark — just dial back a little.

    Lock down social media

    Switch personal Instagram, Facebook, and TikTok to private. Profile photos can be public; videos, stories, and voice clips should be for followers only.

    Do not post voice messages publicly

    Public TikTok voiceovers, podcast clips, and Instagram stories with you talking — these are prime voice-clone training data. Keep spoken content to private accounts.

    Think twice before public video

    Especially in distress or emotional contexts. A clip of you crying at a wedding is perfect training data for a "your family member is in trouble" scam targeting your parents.

    Children especially

    Their voices and faces should not be on public accounts. Scammers target grandparents using cloned grandchild voices — the clip came from somewhere. Share in Apple Shared Albums, Google Photos shared albums, or private family messaging instead.

    Review what is already public

    Google your own name. Check old blogs, school sites, company "about" pages. Request takedowns where you can, especially of voice clips and public children's photos.

    KNOW THE SCRIPTS

    The scariest scam scripts (so you recognize them)

    Reading the exact wording of these scams is unsettling — but a parent who has read this list will recognize it in real time, and hang up.

    Kidnapping scam with cloned child voice

    You answer the phone. You hear your child screaming "Mom! Mom, help me!" A man comes on: "We have your daughter. Do not call police. Wire $5,000 now." The voice is real. The kidnapping is not. Call or text your child on their actual phone — they will answer.

    "Your boss is demanding a wire transfer"

    A voice that sounds like your CEO, or a Zoom video that looks like them, telling you to urgently wire funds to close a deal. Every real company has a wire-verification process — always, always use it, even when the boss seems impatient.

    "Your spouse was in an accident"

    A tearful voice claiming to be your spouse, a police officer, or an ER doctor asking you to pay a bail or fee before they can "release them." Real hospitals and police never demand payment by phone for release. Hang up. Call your spouse directly.

    Fake politician robocalls

    "This is [name] asking you not to vote on Tuesday." Voters in New Hampshire received cloned-voice calls in 2024. If a political call sounds real but the message is strange, it probably is not real.

    "Grandma, I got arrested"

    The classic. A crying voice that sounds exactly like your grandchild. "Please do not tell Mom. I need bail money. Buy $500 in Google Play cards and read me the numbers." No court, anywhere, takes gift cards.

    KEEP CURRENT

    Stay ahead

    Scams evolve. Following one or two of these voices in your feed is enough to stay a month ahead of what scammers are trying.

    FTC Consumer Alerts

    The FTC's consumer alerts blog and @FTC on social. Clear writing, real case studies, no panic.

    Mark Cuban's warnings

    Cuban has been vocal about deepfake impersonations of himself on social. Following him on X or Instagram gets you concrete examples of what current scams look like.

    AARP Fraud Watch Network

    Aimed at seniors but useful at any age. Free helpline at 1-877-908-3360 if you think you or a family member is being targeted.

    CISA alerts

    The Cybersecurity and Infrastructure Security Agency publishes federal alerts on active threats, especially business impersonation.

    DO THIS

    Your action plan

    Everything on this page condensed into three things you can actually finish this month.

    Today: Set up a family password

    Pick a silly, memorable word. Call or video your parents, kids, siblings. Make sure everyone knows it and understands when to use it. Takes 15 minutes. Saves careers and life savings.

    This week: Lock down social media

    Go to every account (yours, your kids', your parents') and switch personal content to private / friends-only. Remove old public voice clips. Ten minutes per account.

    This month: Train your family

    Share this page with parents, grandparents, and anyone over 60 in your life. Walk them through the three threats and the five verification steps. Run a "pretend scam" drill: "What would you do if I called in trouble right now?"

    Today. This week. This month.

    Today: Set up a family password.
    This week: Lock down social media.
    This month: Train your family.

    Deepfake Defense — Spot AI-Generated Scams | TekSure