SALT LAKE CITY — If you’ve been through IT training at work, you know the drill: Don’t reuse your password. Don’t assume an email is real. Don’t click links.
All good advice, as scammers have become experts at tricking workers into accidentally opening the back door for them to get in and do their dirty deeds. But scammers have a new workaround — going in through the front door.
“Attackers are using AI to apply for jobs at companies and actually get those jobs,” said Brian Long, CEO of Adaptive Security, a cybersecurity firm that works with companies to keep the bad guys out.
In some ways, it’s never been easier to be a digital con man, thanks to the combination of rapidly advancing artificial intelligence and a post-pandemic world where many people work remotely.
In this scheme, AI is used to create photo IDs and resumes for the fake job seekers, and even to supply answers during their job interviews. Once they land the job, they attack the company from within.
“They have access to internal systems, documents [and] payroll,” said Long.
Ferreting out the fakes
Long said companies need to have controls in place for how they hire. It’s highly preferable to find a way to meet in person. If that’s not possible, set up a video call, and get weird. Ask for something an AI deepfake would have a hard time pulling off.
“Have someone whistle, or sing, or twist around,” suggested Long.
And don’t accept that blurred background nonsense from a job seeker’s video call. Make them demonstrate that they’re in a real room.
An AI deepfake can smile and wave all it wants, but can it turn off the lamp behind it? Sure, it might seem silly to make someone jump through a couple of hoops during a video call, but it could be the difference between talking to an actual live human being and a deepfake persona.