Question EVERYTHING

“Don’t believe everything you see and hear”

AI and investment fraud: Deepfakes can be used in false advertising to promote fake investments, often by impersonating public figures.

AI and romance fraud: Scammers use AI to create fake profiles on dating websites or social media platforms. Posing as genuine people looking for companionship, they aim to build trust and then ask to ‘borrow’ money.

AI and extortion: Fraudsters can create deepfakes that publicly humiliate their victims by manipulating their image to depict them in a vulnerable state or performing an explicit act. The perpetrators’ intention is to blackmail victims and extort money, a practice known as ‘sextortion’.

AI and voice spoofing scams: Scammers generate voicemails or social media audio clips that impersonate a friend or relative who needs money in an emergency. Hearing a familiar voice in distress, victims can act on impulse.

AI and fraudulent websites, emails and texts: Scammers can use AI to create realistic-looking communications from banks, pension providers and other official bodies. These fraudulent websites and messages may ask for personal information, including bank details, or invite victims to make payments.

There is a deepfake attempt every five minutes

(Source: Entrust Annual Identity Fraud Report 2025)