
Deepfake Scams Are on the Rise


The FBI has issued a warning that cybercriminals are applying for remote IT positions using deepfakes and personally identifiable information (PII) obtained from the internet in order to gain access to systems, passwords, and sensitive data. According to the advisory, complaints about the use of deepfakes and stolen PII in remote job applications have grown significantly over the past year.

Deepfakes are computer-generated audio or video fabrications of a real person, and they are increasingly used in scams. The occupations most frequently targeted in this new scheme are IT and computer programming roles. The imposters seek remote employment in order to gain access to customer financial information, corporate IT databases, and proprietary data, all of which can subsequently be stolen.

To generate their deepfakes, criminals use credentials and pictures acquired on the internet. Using fake video or audio, they can pose as a legitimate applicant in a virtual interview. The FBI explains that fraudulent interviews contain specific telltale indicators that the interviewee might not be genuine.
"The actions and lip movement of the person seen interviewed on camera do not completely coordinate with the audio of the person speaking," the advisory notes. "At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually," it continues.

Pre-employment background checks have also flagged cases where the information used to apply for a job belongs to another individual. Because it is simple to recreate an identity from leaked information, it is crucial to report stolen PII as soon as the theft is discovered. As the UK Information Commissioner's Office (ICO) puts it, "your name, address, and date of birth provide enough information to create another 'you.'"

Deepfakes are increasingly being used in cybercrime. Last month, the likeness of Tesla CEO Elon Musk was used in a scam circulated on YouTube to defraud individuals of their Bitcoin and Ethereum. Online thieves took over channels and accounts, redesigned them to look as though they belonged to Tesla, and published fraudulent deepfake videos of Musk encouraging viewers to take part in fake cryptocurrency giveaways. According to the BBC, the scammers made $243,000 in just over a week. YouTube has come under fire for its slow removal of such content.

Many people worry that artificial intelligence capabilities will be exploited for criminal purposes and have cautioned against their unchecked use and development. While deepfakes can serve lighthearted entertainment, cybercriminals also exploit the technology to cause significant harm. Famous people are the easiest targets for deepfake fabrication, particularly when their likeness is used to spread false information, but deepfake risks can affect anyone. Healthy skepticism toward everything you see online is essential to staying safe on the internet. Now that you know deepfakes exist, you might think twice before believing a video of a government official making an outrageous claim.
