If you've had a panicked phone call from your daughter that turned out to be a stranger using her cloned voice — you're not alone, and you're not stupid.
AI has done two things to scams in the last 24 months. It's made the old ones harder to spot. And it's enabled new ones that were impossible before.
This guide covers the seven AI-powered scams currently doing the most damage to UK households and businesses, how to recognise each one, what to do if you've been targeted, and how to protect yourself going forward. Written in plain English. Backed by real UK data. No jargon, no panic.
If you've already been scammed and money has left your account, stop reading and act now:
Call 159 — this connects you straight to your bank's fraud team. Works with Barclays, HSBC, Lloyds, NatWest, Santander and most major UK banks.
Then report to Action Fraud or call 0300 123 2040. The faster you act, the better your chances of recovery.
- Why AI scams are exploding right now
- Scam 1 — AI voice cloning ("the grandchild call")
- Scam 2 — Deepfake celebrity investment fraud
- Scam 3 — AI-written phishing emails and texts
- Scam 4 — Deepfake video calls (CEO fraud)
- Scam 5 — AI romance scams ("pig butchering")
- Scam 6 — AI customer-service impersonation
- Scam 7 — Crypto + AI investment scams
- How to protect yourself and your family
- What to do if you've been targeted
- Frequently asked questions
Why AI scams are exploding right now
Until recently, most scams had obvious tells. The grammar was off. The accent didn't match the name. The "Nigerian prince" email read like it had been written by someone who'd never been near the country it claimed to come from.
Three things changed in 2023-24:
- AI voice cloning became almost free. Cloning a convincing version of someone's voice now takes 30 seconds of audio and £5 of AI credit. It used to need a recording studio and a specialist
- Large language models removed the language barrier. Phishing emails are now grammatically perfect, regionally appropriate, and personalised at scale
- Deepfake video became real-time. Until 2024, deepfake videos took hours to render. Today, scammers can run live video calls with a fake face mapped over their own
The result: fraud and scam complaints to the UK Financial Ombudsman rose roughly 80% between 2023 and mid-2025. In a single 2024 crypto investment scam using deepfaked footage of public figures, 6,179 victims in the UK and Canada lost £27 million between them.
What follows is a breakdown of the seven scams currently doing the most damage in the UK, with specific tells for each.
Scam 1 — AI voice cloning ("the grandchild call")
Scammers call an older relative late at night. The cloned voice cries down the phone: "Mum, I've been in an accident, I need money for bail, please don't tell Dad." The target panics, transfers funds, and only discovers the truth hours later.
Warning signs
- Unexpected call from a family member in apparent crisis
- Urgent need for money — usually bail, a car accident, or hospital fees
- Insistence on secrecy: "don't tell anyone else"
- Background noise that sounds too clean or too generic (real distress is messy)
- Caller resists letting you call them back
How to protect yourself
- Always hang up and call back on a known number. Even if the voice is perfect, the scammer is not on your family member's actual phone
- Agree a family code word right now with parents, partners, and children. A single nonsense word ("pineapple") known only to family members defeats this entire scam
- Don't read out personal information to confirm identity — they may already have it
Scam 2 — Deepfake celebrity investment fraud
The scam starts with a deepfaked video of a well-known figure apparently endorsing an investment platform. Victims click through to a slick-looking platform that mimics a real investment service. They deposit money. They see "returns" growing in their fake dashboard. When they try to withdraw, the platform asks for more fees, more verification, more deposits. Eventually it disappears.
Warning signs
- Celebrity endorsement of any investment, crypto, or trading platform — real UK consumer advocates never recommend specific investments
- Promises of guaranteed returns, "AI trading bots," or "passive income systems"
- Pressure to deposit before a deadline
- Requests for additional fees to "unlock withdrawals"
- The platform isn't listed on the FCA Financial Services Register
How to protect yourself
- If a celebrity appears to endorse an investment, assume it's a deepfake. Real public figures don't promote specific platforms
- Always check the platform on the FCA Register before depositing
- If you're not sure, search "[platform name] FCA warning" — the FCA publishes warnings about known scam platforms
- Genuine investments take time. Anything that promises overnight returns is fraud
Scam 3 — AI-written phishing emails and texts
These phishing messages typically claim a small payment is needed (£1.99 for a delivery, £2.50 for a missed tax payment) and link to a convincing-looking fake site that captures your card details and one-time codes.
Warning signs
- Any text or email asking you to "click here to pay" or "click here to verify"
- The URL looks slightly wrong: royalmail-delivery.com instead of royalmail.com
- Urgency: "action required within 24 hours"
- Requests for your bank login, card details, or one-time code
- The message arrives unexpectedly — you weren't expecting a parcel, HMRC payment, or bank verification
How to protect yourself
- Never click links in unexpected texts or emails. Open the official app or type the URL into your browser yourself
- Forward suspicious texts to 7726 (free, works on all UK mobile networks)
- Forward suspicious emails to report@phishing.gov.uk (the National Cyber Security Centre)
- If in doubt, call the organisation on a number from their official website, not the one in the message
Scam 4 — Deepfake video calls (CEO fraud)
This is the corporate version of voice cloning: a deepfaked executive on a live video call. In the most notorious 2024 case, a Hong Kong-based finance worker at engineering firm Arup transferred $25.6 million after a deepfake video call featuring a synthetic CFO and several fake colleagues.
Warning signs
- Unexpected video call from a senior person requesting urgent financial action
- Instructions to bypass normal authorisation processes
- Confidentiality requested: "don't mention this to anyone yet"
- The video has subtle artefacts — lip-sync delays, eye movement that doesn't quite match, faces that flicker when turning
How to protect yourself
- Verify out of channel. If your CFO calls asking for an urgent transfer, hang up and call them back on their known internal number — or walk to their desk
- Establish a written policy that large payments require two-person sign-off, regardless of the requester's seniority
- Train finance and HR teams to expect this attack, and give them explicit authority to refuse high-pressure requests
- Agree shared team code words or challenge phrases for sensitive instructions
Scam 5 — AI romance scams ("pig butchering")
The scam begins with an AI-assisted romantic persona who builds trust over weeks or months. Eventually, the persona mentions an investment opportunity they've been making good returns from. They offer to "help" the target invest a small amount. The platform shows fake profits. The target invests more. When they try to withdraw, the platform — and the persona — disappear.
The name "pig butchering" comes from the Chinese term shā zhū pán — fattening the pig before slaughter.
Warning signs
- Match seems too perfect — successful, attractive, attentive, available
- Refuses to meet in person or join video calls (or video looks slightly "off")
- Conversation gradually shifts toward money, investing, or "an opportunity"
- Pressure to invest using their platform of choice
- Initial "returns" appear but withdrawals become difficult
How to protect yourself
- Never send money to someone you've only met online — no matter how long the relationship has lasted
- Never invest through a platform recommended by a romantic interest
- Insist on a live, unscripted video call early on. AI personas often resist
- Reverse-image-search the photos. Many pig-butchering personas use stolen or AI-generated images
- Talk to a trusted friend or family member before sending any money
Scam 6 — AI customer-service impersonation
The "fraud officer" tells you that suspicious activity has been detected on your account, and to protect your money, you need to move it to a "safe account." That safe account is theirs.
Warning signs
- Any call asking you to move money to keep it safe — banks never do this
- Any call asking for your full PIN, password, or one-time code
- Pressure to act immediately, without time to think
- Instructions to lie to bank staff if you're called by them
- Caller display shows a familiar number — this can easily be faked
How to protect yourself
- Hang up and call 159 — this is the official Stop Scams UK number that connects you directly to your bank's real fraud team
- Banks never ask you to move money to "safe accounts"
- Banks never ask for full passwords or PINs
- If you're unsure, hang up, wait five minutes (scammers can hold the line open), and call your bank back from a different phone
Scam 7 — Crypto + AI investment scams
This is the catch-all category: crypto fraud dressed up in AI branding. Variants include AI trading bots that promise daily returns, AI-themed crypto tokens that "pump and dump," and full-service "AI investment platforms" that combine all of the above into one wrapper.
Warning signs
- Any promise of guaranteed returns from an AI bot, algorithm, or system
- Pressure to invest in a specific cryptocurrency before a "launch deadline"
- Celebrity endorsements (almost certainly deepfaked)
- Platforms not registered with the FCA
- Withdrawals require additional fees, taxes, or verification deposits
- "Insider information" or "guaranteed market signals"
How to protect yourself
- Genuine AI doesn't guarantee returns. Anyone claiming it does is either lying or doesn't understand AI
- Crypto transactions are usually irreversible — if you send funds to a scammer, recovery is rarely possible
- Check any investment platform on the FCA Register before depositing
- If something sounds too good to be true, it almost certainly is
- The FCA publishes warnings about known scam platforms at fca.org.uk/scamsmart
How to protect yourself and your family
Most AI scams exploit the same handful of psychological levers: urgency, secrecy, authority, and emotional connection. The defences below break those levers.
1. Set up a family code word, today
The single most effective defence against voice-cloning scams costs nothing. Agree a code word with anyone who might call you in a crisis — children, parents, partner. A single word, chosen at random, never mentioned online or by text. If a "family member" calls in distress, ask for the code word. Any real family member will know it; no voice clone ever will.
2. Build a "verify before you act" habit
For anything urgent or financial, the rule is the same: hang up. Wait. Call back on a known number. This single habit defeats voice cloning, deepfake video, fake bank fraud teams, and CEO scams. The five minutes it costs you is the difference between safe and scammed.
3. Talk to elderly relatives
Older relatives are disproportionately targeted because scammers profile likely victims, not because they're less intelligent. A direct conversation with parents or grandparents — explaining voice cloning, the 159 number, and the family code word — is one of the most valuable conversations you'll have this year. Age UK has free guides if you'd rather they read it independently.
4. Lock down your social media
Scammers harvest voice samples from social media. The fewer videos of you and your family talking on public profiles, the harder it is to clone you. Consider setting Facebook, Instagram, and TikTok profiles to private — especially for any account belonging to a child or teenager.
5. Never invest based on a celebrity endorsement
Real UK consumer figures — Martin Lewis, the BBC's Watchdog presenters, any FCA-authorised advisor — never endorse specific investments. If you see one appearing to do so, it's a deepfake. Always check platforms against the FCA Register.
6. Use 159 for bank fraud, 7726 for scam texts
Two free UK shortcodes worth memorising:
- 159 — call this number to reach your bank's real fraud team. Works with most major UK banks
- 7726 — forward any scam text to this number. Free, works on all UK networks. Your provider investigates
What to do if you've been targeted
Time matters enormously with fraud — the faster you act, the better your chances of recovery and the more useful your report is for stopping further harm to others.
Under UK rules introduced in October 2024, banks must reimburse most victims of authorised push payment (APP) fraud up to £85,000. Reimbursement isn't guaranteed — there are exceptions for gross negligence — but the protections are significantly stronger than they used to be. Always report quickly; delays can affect your case.
Crypto investment fraud is generally not recoverable, since crypto transactions are usually irreversible.
Frequently asked questions
- AI hasn't created many new scams — it has made the old ones dramatically more dangerous
- The most effective single defence is a verification habit: hang up, wait, call back on a known number
- Agree a family code word today — it costs nothing and defeats voice cloning entirely
- Save 159 (bank fraud), 7726 (scam texts), and report@phishing.gov.uk to your phone now
- Talk to older relatives. They're targeted more, but a 10-minute conversation about these scams protects them
- If you've been scammed, act fast — UK reimbursement rules favour quick reporting
- Using AI yourself is safe and beneficial. Being targeted by people who use AI against you is the risk
Want to understand AI properly?
The best defence against AI scams is understanding how AI actually works. Our AI Essentials course teaches you the practical, jargon-free version — built for UK learners. 18 lessons, lifetime access, verified certificate.
View AI Essentials →
Save 5+ hours a week with AI
Free 7-page guide on the 5 habits that cut your weekly admin time in half. Plain English. Instant download. No card.
Get the free guide →