Question EVERYTHING – Page 4

Deepfake videos: Be vigilant

Verify authenticity of any famous person advertisement

Be wary of investment schemes promoted by any famous person, as these deepfakes are designed to look real. Research independently: check that the financial services firm is authorised by the Financial Conduct Authority (FCA) for the services being offered, and that its contact details match those listed on the FCA Firm Checker: fca.org.uk/consumers/fca-firm-checker

Top 3 famous people currently impersonated by fraudsters:
Martin Lewis
Elon Musk
Richard Branson

(Source: Action Fraud, November 2025)

Double check legitimacy of any video

Scammers can create hyper-realistic videos of ‘family members’ or ‘friends’ in distress to manipulate people into sending money. Pause, think, and verify the request independently through a separate, trusted channel before acting.

Voice spoofing: Listen carefully

Test if the voice is from a real person

If you are contacted by a ‘loved one’ asking for money in an emergency, call them back directly on their known number to confirm that it really is them.

Alternatively, ask the voice on the end of the phone for specific details of a shared memory to see if they know the answer, or set up a family code word in advance to verify that the person is really who they say they are.

Criminals can clone a person’s voice from as little as 3 seconds of audio and manipulate it to say whatever they want.

Never prompt a caller by volunteering personal information. Scammers can mask the number they’re calling from and can even appear to be calling from a friend’s or family member’s number, so be wary.

More than 1 in 4 UK adults (28%) think they have been the target of an AI voice cloning scam in the last year.

(Source: Mortar research for Starling Bank, August 2024)