Don't let scammers use face recognition to steal your identity: Internet Scambusters #1,107
Did you know that the data used to recognize your face as a security check when you sign on to bank and other accounts could also be used for identity theft?
A Chinese gang has already used fake facial recognition software to steal money from victims in some parts of Asia, and now there are fears the scam could be heading our way.
In this week's issue, we'll fill you in on what's happening and outline the steps you can take to avoid the crooks and stay safe.
Let's get started…
How Your Face Could Launch Identity Theft
You probably don't think of your face as a collection of data when you peer at it in the mirror or a photo. But it is.
Cameras may record your facial image, but software can convert it into a stream of digits - representing biometric patterns - that can be, and is being, used for security. And now scammers have joined the fray.
We already know how scammers use stolen photos in romance scams. But now evidence is starting to emerge about mobile apps that can use facial data to drain a bank account.
It's already appearing in parts of Asia. And you can expect to see it on our shores in the not-too-distant future.
What Is Facial Recognition?
We humans recognize each other using memory banks in our brains. Facial recognition software does the same, comparing what it sees with existing records.
It's one of the key biometric processes we wrote about in issue #866: How Safe Are Fingerprint and Facial Recognition Sign-ons? And it helps organizations and security services confirm we are who we say we are.
At least that's the theory. But it doesn't take much of a leap of imagination to see its potential for a scam. If someone can fake your face, they can pretend to be you. And that's exactly what's happening.
Facial Recognition Scams Are Real
A new report from Singapore-based cyber security firm Group-IB claims that malware that finds its way onto some mobile phones can steal facial data. It uses artificial intelligence (AI), the same technology that's used to create deepfake images.
Victims are tricked into providing a face video and a copy of their ID card supposedly for a new security app. The image and info are then used to open or access victims' bank accounts. We won't explain how it's done but if you're interested you can read the full, widely-reported Group-IB research: Face Off: Group-IB Identifies First iOS Trojan Stealing Facial Recognition Data.
The real worry about this latest scam is that the perps found a way of getting their malware onto iPhones. Initially, this was through Apple's app-testing program TestFlight. When Apple put a stop to that, the crooks, believed to be based in China, found a workaround via mobile device management (MDM) software, which is normally used only by IT managers to control employee iPhone usage.
This is another frightening example of how identity theft is exploiting new technology.
Cybercrooks may intercept facial recognition data when it's being sent over the internet. Or they might steal it during data breaches. And in Latin America, con artists are employing what's been called the "false face scam."
According to cybersecurity specialists Insside, the fraudsters use mannequins and photos of their victims to open bank accounts by fooling the verification process.
"Acquiring a user image is easier than you might think: simply obtain images of personal documents or search for photos online, print them life-size, and then attach them to a mannequin to use for facial verification during account opening," the firm explains.
Can You Protect Yourself Against Face Recognition Scams?
As we noted, facial recognition scams are currently active outside of the US. But, as a report on the tech consumer website Tom's Guide noted recently: "As with other malware campaigns, if this one proves successful, the cybercriminals behind it could expand their operations to target both iPhone and Android users in the US, Canada, and other English-speaking countries."
So, can you protect yourself against this risk?
If you use an Android phone, you should definitely have high-level security software installed and avoid downloading apps from anywhere but the Google Play Store.
For iPhone users, Tom's Guide says don't install any apps through TestFlight.
"The same goes for adding a MDM profile to your iPhone," says the site. "Your employer is the only one that should be asking you to do this and that's only if you have a company-issued iPhone."
Beyond these specific actions, the emergence of facial recognition scams highlights the need to limit how many images of your face are available on social media and similar sites.
Beware also of any app that asks you to scan your face. Make sure the app is legit and that it genuinely needs a picture of your face.
Finally, as always, monitor your financial accounts and credit records regularly and frequently for signs of activity you didn't initiate. If you spot anything, notify the relevant organization and law enforcement immediately.
This Week's Alert
FraudGPT: Yes, scammers now have their own version of the artificial intelligence (AI) app ChatGPT. It's called FraudGPT and is available on subscription to crooks on the dark web. There's another one called WormGPT. They've been around for a while, but security specialists are growing increasingly concerned about their capabilities as AI use enters the mainstream.
"FraudGPT is described as a great tool for creating undetectable malware, writing malicious code, finding leaks and vulnerabilities, creating phishing pages, and for learning hacking," says security firm Trustwave.
One more reason to be skeptical of just about anything and everything you read online and in messages. Stop, think and question before you act.
That's it for today -- we hope you enjoy your week!