If you've had a panicked phone call from your daughter that turned out to be a stranger using her cloned voice, you're not alone, and you're not stupid.

AI has done two things to scams in the last 24 months. It's made the old ones harder to spot. And it's enabled new ones that were impossible before.

This guide covers the seven AI-powered scams currently doing the most damage to UK households and businesses, how to recognise each one, what to do if you've been targeted, and how to protect yourself going forward. Written in plain English. Backed by real UK data. No jargon, no panic.

£27m
Lost by 6,179 people in the UK and Canada in a single 2024 crypto deepfake scam
Source: Surfshark, SQ Magazine
+80%
Rise in UK fraud and scam complaints to the Financial Ombudsman between 2023 and mid-2025
Source: Financial Ombudsman Service
£250m
UK government investment over three years in the Fraud Strategy 2026-2029, published March 2026
Source: gov.uk Fraud Strategy
1,633%
Increase in deepfake voice-call (vishing) attacks in Q1 2025 versus the previous quarter
Source: Industry security reports
Important

If you've already been scammed and money has left your account, stop reading and act now:

Call 159 — this connects you straight to your bank's fraud team. Works with Barclays, HSBC, Lloyds, NatWest, Santander and most major UK banks.

Then report to Action Fraud or call 0300 123 2040. The faster you act, the better your chances of recovery.

Why AI scams are exploding right now

Until recently, most scams had obvious tells. The grammar was off. The accent didn't match the name. The "Nigerian prince" email read like it had been written by someone who'd never been near the country it claimed to come from.

Three things changed in 2023-24:

  • AI voice cloning became almost free. Cloning a convincing version of someone's voice now takes 30 seconds of audio and £5 of AI credit. It used to need a recording studio and a specialist
  • Large language models removed the language barrier. Phishing emails are now grammatically perfect, regionally appropriate, and personalised at scale
  • Deepfake video became real-time. Until 2024, deepfake videos took hours to render. Today, scammers can run live video calls with a fake face mapped over their own

The result: fraud and scam complaints to the UK Financial Ombudsman rose roughly 80% between 2023 and mid-2025. In a single 2024 crypto investment scam using deepfaked footage of public figures, 6,179 victims in the UK and Canada lost £27 million between them.

What follows is a breakdown of the seven scams currently doing the most damage in the UK, with specific tells for each.

Scam 1 — AI voice cloning ("the grandchild call")

Voice cloning — the most emotionally devastating UK scam right now
How it works
Scammers find a 10-30 second clip of a family member's voice — usually scraped from social media, a podcast appearance, or a voicemail greeting. They feed it into an AI voice cloning tool. The tool then lets them type any sentence and have it spoken in that family member's voice, in real time, with full emotional range.

They call an older relative late at night. The cloned voice cries down the phone: "Mum, I've been in an accident, I need money for bail, please don't tell Dad." The target panics, transfers funds, and only discovers the truth hours later.
Red flags
  • Unexpected call from a family member in apparent crisis
  • Urgent need for money — usually bail, a car accident, or hospital fees
  • Insistence on secrecy: "don't tell anyone else"
  • Background noise that sounds too clean or too generic (real distress is messy)
  • Caller resists letting you call them back
How to defend yourself
  • Always hang up and call back on a known number. Even if the voice is perfect, the scammer is not on your family member's actual phone
  • Agree a family code word right now with parents, partners, and children. A single nonsense word ("pineapple") known only to family members defeats this entire scam
  • Don't read out personal information to confirm identity — they may already have it

Scam 2 — Deepfake celebrity investment fraud

"Martin Lewis told me to invest in this..."
How it works
Scammers create deepfake videos of trusted UK personalities — Martin Lewis, Holly Willoughby, Jeremy Clarkson, Elon Musk — appearing to endorse an "incredible new investment opportunity." The videos are posted as paid ads on Facebook, Instagram, TikTok, and YouTube.

Victims click through to a slick-looking platform that mimics a real investment service. They deposit money. They see "returns" growing in their fake dashboard. When they try to withdraw, the platform asks for more fees, more verification, more deposits. Eventually it disappears.
Red flags
  • Celebrity endorsement of any investment, crypto, or trading platform — real UK consumer advocates never recommend specific investments
  • Promises of guaranteed returns, "AI trading bots," or "passive income systems"
  • Pressure to deposit before a deadline
  • Requests for additional fees to "unlock withdrawals"
  • The platform isn't listed on the FCA Financial Services Register
How to defend yourself
  • If a celebrity appears to endorse an investment, assume it's a deepfake. Real public figures don't promote specific platforms
  • Always check the platform on the FCA Register before depositing
  • If you're not sure, search "[platform name] FCA warning" — the FCA publishes warnings about known scam platforms
  • Genuine investments take time. Anything that promises overnight returns is fraud

Scam 3 — AI-written phishing emails and texts

The grammatically perfect "your parcel is delayed" text
How it works
Phishing scams used to give themselves away with bad grammar and odd phrasing. AI has eliminated those tells. Modern phishing texts and emails — purporting to be from Royal Mail, HMRC, your bank, DVLA, your energy supplier — are now indistinguishable in language quality from the real thing.

They typically claim a small payment is needed (£1.99 for a delivery, £2.50 for a missed tax payment) and link to a convincing-looking fake site that captures your card details and one-time codes.
Red flags
  • Any text or email asking you to "click here to pay" or "click here to verify"
  • The URL looks slightly wrong: royalmail-delivery.com instead of royalmail.com
  • Urgency: "action required within 24 hours"
  • Requests for your bank login, card details, or one-time code
  • The message arrives unexpectedly — you weren't expecting a parcel, HMRC payment, or bank verification
How to defend yourself
  • Never click links in unexpected texts or emails. Open the official app or type the URL into your browser yourself
  • Forward suspicious texts to 7726 (free, works on all UK mobile networks)
  • Forward suspicious emails to report@phishing.gov.uk (the National Cyber Security Centre)
  • If in doubt, call the organisation on a number from their official website, not the one in the message

Scam 4 — Deepfake video calls (CEO fraud)

The fake CFO on a Zoom call asking for an urgent wire transfer
How it works
Aimed primarily at businesses. A junior finance employee receives an unexpected video call from their CFO or CEO — except the executive's face and voice are AI-generated in real time. The "executive" instructs them to make an urgent wire transfer for a confidential acquisition or supplier payment. The employee, seeing a live face and hearing a live voice, complies.

In the most notorious 2024 case, a Hong Kong-based finance worker at engineering firm Arup transferred $25.6 million after a deepfake video call featuring a synthetic CFO and several fake colleagues.
Red flags
  • Unexpected video call from a senior person requesting urgent financial action
  • Instructions to bypass normal authorisation processes
  • Confidentiality requested: "don't mention this to anyone yet"
  • The video has subtle artefacts — lip-sync delays, eye movement that doesn't quite match, faces that flicker when turning
How to defend yourself
  • Verify out of channel. If your CFO calls asking for an urgent transfer, hang up and call them back on their known internal number — or walk to their desk
  • Establish a written policy that large payments require two-person sign-off, regardless of seniority of the requester
  • Train finance and HR teams to expect this attack, and make clear they're authorised to refuse high-pressure requests
  • Agree shared team code words or verification phrases for sensitive instructions

Scam 5 — AI romance scams ("pig butchering")

The dating-app match who slowly steers you toward "investing"
How it works
A persona — increasingly maintained by AI rather than a human scammer — matches with a target on a dating app, social media, or even LinkedIn. The "relationship" builds over weeks or months. The conversation feels natural and emotionally rewarding. The persona is attractive, attentive, successful.

Eventually, the persona mentions an investment opportunity they've been making good returns from. They offer to "help" the target invest a small amount. The platform shows fake profits. The target invests more. When they try to withdraw, the platform — and the persona — disappear.

The name "pig butchering" comes from the Chinese term shā zhū pán — fattening the pig before slaughter.
Red flags
  • Match seems too perfect — successful, attractive, attentive, available
  • Refuses to meet in person or join video calls (or video looks slightly "off")
  • Conversation gradually shifts toward money, investing, or "an opportunity"
  • Pressure to invest using their platform of choice
  • Initial "returns" appear but withdrawals become difficult
How to defend yourself
  • Never send money to someone you've only met online — no matter how long the relationship has lasted
  • Never invest through a platform recommended by a romantic interest
  • Insist on a live, unscripted video call early on. AI personas often resist
  • Reverse-image-search the photos. Many pig-butchering personas use stolen or AI-generated images
  • Talk to a trusted friend or family member before sending any money

Scam 6 — AI customer-service impersonation

The "fraud team" call that comes from your real bank's number
How it works
You receive a call from what appears to be your bank's fraud team. The caller display shows your bank's real number (spoofed). The voice is calm, professional, and uses your bank's terminology correctly — increasingly because it's AI-generated and trained on real banking scripts.

The "fraud officer" tells you that suspicious activity has been detected on your account, and to protect your money, you need to move it to a "safe account." That safe account is theirs.
Red flags
  • Any call asking you to move money to keep it safe — banks never do this
  • Any call asking for your full PIN, password, or one-time code
  • Pressure to act immediately, without time to think
  • Instructions to lie to bank staff if you're called by them
  • Caller display shows a familiar number — this can easily be faked
How to defend yourself
  • Hang up and call 159 — this is the official Stop Scams UK number that connects you directly to your bank's real fraud team
  • Banks never ask you to move money to "safe accounts"
  • Banks never ask for full passwords or PINs
  • If you're unsure, hang up, wait five minutes (scammers can hold the line open), and call your bank back from a different phone

Scam 7 — Crypto + AI investment scams

"This AI-powered trading bot guarantees 5% per day"
How it works
The most lucrative AI scam category for criminals. Crypto + AI scams combine three forces: the technical complexity of crypto (which most victims don't fully understand), the mystique of AI (which scammers exploit to make impossible claims sound plausible), and the irreversibility of crypto transactions (which makes recovery nearly impossible).

Variants include AI trading bots that promise daily returns, AI-themed crypto tokens that "pump and dump," and full-service "AI investment platforms" that combine all of the above into one wrapper.
Red flags
  • Any promise of guaranteed returns from an AI bot, algorithm, or system
  • Pressure to invest in a specific cryptocurrency before a "launch deadline"
  • Celebrity endorsements (almost certainly deepfaked)
  • Platforms not registered with the FCA
  • Withdrawals require additional fees, taxes, or verification deposits
  • "Insider information" or "guaranteed market signals"
How to defend yourself
  • Genuine AI doesn't guarantee returns. Anyone claiming it does is either lying or doesn't understand AI
  • Crypto transactions are usually irreversible — if you send funds to a scammer, recovery is rarely possible
  • Check any investment platform on the FCA Register before depositing
  • If something sounds too good to be true, it almost certainly is
  • The FCA publishes warnings about known scam platforms at fca.org.uk/scamsmart

How to protect yourself and your family

Most AI scams exploit the same handful of psychological levers: urgency, secrecy, authority, and emotional connection. The defences below break those levers.

1. Set up a family code word, today

The single most effective defence against voice-cloning scams costs nothing. Agree a code word with anyone who might call you in a crisis — children, parents, partner. A single word, chosen at random, never mentioned online or by text. If a "family member" calls in distress, ask for the code word. Any real family member will know it; no voice clone ever will.

2. Build a "verify before you act" habit

For anything urgent or financial, the rule is the same: hang up. Wait. Call back on a known number. This single habit defeats voice cloning, deepfake video, fake bank fraud teams, and CEO scams. The five minutes it costs you is the difference between safe and scammed.

3. Talk to elderly relatives

Older relatives are disproportionately targeted because scammers profile likely victims, not because they're less intelligent. A direct conversation with parents or grandparents — explaining voice cloning, the 159 number, and the family code word — is one of the most valuable conversations you'll have this year. Age UK has free guides if you'd rather they read it independently.

4. Lock down your social media

Scammers harvest voice samples from social media. The fewer videos of you and your family talking on public profiles, the harder it is to clone you. Consider setting Facebook, Instagram, and TikTok profiles to private — especially for any account belonging to a child or teenager.

5. Never invest based on a celebrity endorsement

Real UK consumer champions, whether Martin Lewis or the BBC's Watchdog presenters, never endorse specific investments, and legitimate FCA-authorised advisers don't advertise through celebrity videos. If you see a famous face appearing to promote an investment, it's a deepfake. Always check platforms against the FCA Register.

6. Use 159 for bank fraud, 7726 for scam texts

Two free UK shortcodes worth memorising:

  • 159 — call this number to reach your bank's real fraud team. Works with most major UK banks
  • 7726 — forward any scam text to this number. Free, works on all UK networks. Your provider investigates

What to do if you've been targeted

Time matters enormously with fraud — the faster you act, the better your chances of recovery and the more useful your report is for stopping further harm to others.

Money has been taken
Call 159 immediately to reach your bank's fraud team. They can freeze the transaction and start an investigation. Then report to Action Fraud (see below).
Scam call or attempted scam
Report to Action Fraud at actionfraud.police.uk or call 0300 123 2040. (Note: from 2026 Action Fraud is being replaced with a new "Report Fraud" service.)
Scam text message
Forward to 7726. Free on all UK networks.
Scam email
Forward to report@phishing.gov.uk (the NCSC's suspicious email reporting service).
Scam website
Report to the NCSC at ncsc.gov.uk.
Identity theft
Contact Cifas for protective registration. Also notify your bank and the credit reference agencies (Experian, Equifax, TransUnion).
Immediate danger
Call 999 if a scammer is at your door, or you believe a crime is taking place right now.
Note on reimbursement

Under UK rules introduced in October 2024, banks must reimburse most victims of authorised push payment (APP) fraud up to £85,000. Reimbursement isn't guaranteed — there are exceptions for gross negligence — but the protections are significantly stronger than they used to be. Always report quickly; delays can affect your case.

Crypto investment fraud is generally not recoverable, since crypto transactions are usually irreversible.

Frequently asked questions

What is an AI scam?
An AI scam is any form of fraud that uses artificial intelligence to deceive its victim. The most common UK examples include voice cloning, deepfake video calls, AI-written phishing emails, AI-generated fake celebrity endorsements promoting investments, and romance scams using AI-generated personas. The technology isn't new — what's new is that it's now cheap, fast, and accessible to almost anyone with a laptop.
How do I report an AI scam in the UK?
Report AI scams to Action Fraud at actionfraud.police.uk or call 0300 123 2040. (Note: from 2026 Action Fraud is being replaced with a new "Report Fraud" service.) For scam texts, forward them to 7726. For scam emails, forward to report@phishing.gov.uk. If money has been taken, also call 159 to reach your bank's fraud team immediately.
Can I tell if a voice call is a deepfake?
Increasingly, no — high-quality AI voice clones can be indistinguishable from a real voice, especially in short emotional calls. The best defence is a verification habit: always hang up and call the person back on a known number before agreeing to anything urgent or financial. Family code words, agreed in advance, are also highly effective.
Are older people more at risk of AI scams?
Yes, but not for the reason most people think. Older people aren't less intelligent — they're more frequently targeted because scammers profile likely victims. AI scams particularly exploit the bonds of family and trust, which makes "grandparent scams" (a fake grandchild calling in distress) disproportionately effective. Everyone is at risk; older relatives just receive more targeting attempts.
If I'm a victim, can I get my money back?
It depends on the type of fraud. Under UK rules introduced in October 2024, banks must reimburse victims of authorised push payment (APP) fraud up to £85,000, with limited exceptions. Card fraud is also typically refundable. Crypto investment fraud is rarely refundable, since crypto transactions are usually irreversible. Always report the fraud quickly — delays can affect reimbursement.
Does AI itself enable these scams, or are people just calling old scams "AI"?
Both. Many scams that existed before AI are now far more effective because AI removes the language and quality barriers that used to expose them. A clumsy phishing email from 2018 might have given itself away with poor grammar — today's AI-written version reads flawlessly. Other scams (like real-time voice cloning) were genuinely impossible before AI. The honest answer is that AI hasn't invented many new scams, but it has made the old ones dramatically more dangerous.
Should I be afraid of using AI myself?
No. Using AI tools like ChatGPT, Claude, or Gemini for your work, personal projects, or learning is genuinely safe and overwhelmingly beneficial. The same technology that powers scams powers thousands of legitimate productivity gains. The risk isn't in using AI — it's in being targeted by people who use it against you. Learning how AI works actually makes you better at spotting AI scams.
The bottom line
  • AI hasn't created many new scams — it has made the old ones dramatically more dangerous
  • The most effective single defence is a verification habit: hang up, wait, call back on a known number
  • Agree a family code word today — it costs nothing and defeats voice cloning entirely
  • Save 159 (bank fraud), 7726 (scam texts), and report@phishing.gov.uk to your phone now
  • Talk to older relatives. They're targeted more, but a 10-minute conversation about these scams protects them
  • If you've been scammed, act fast — UK reimbursement rules favour quick reporting
  • Using AI yourself is safe and beneficial. Being targeted by people who use AI against you is the risk

Want to understand AI properly?

The best defence against AI scams is understanding how AI actually works. Our AI Essentials course teaches you the practical, jargon-free version — built for UK learners. 18 lessons, lifetime access, verified certificate.

View AI Essentials →
Free Download

Save 5+ hours a week with AI

Free 7-page guide on the 5 habits that cut your weekly admin time in half. Plain English. Instant download. No card.

Get the free guide →