LastPass: Hackers targeted employee in failed deepfake CEO call

By Sergiu Gatlan, Bleeping Computer

LastPass revealed this week that threat actors targeted one of its employees in a voice phishing attack, using deepfake audio to impersonate Karim Toubba, the company's Chief Executive Officer.

While 25% of people have been on the receiving end of an AI voice impersonation scam or know someone who has, according to a recent global study, the LastPass employee didn't fall for it, partly because the attacker contacted them over WhatsApp, an uncommon channel for business communication.

"In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp," LastPass intelligence analyst Mike Kosak said.


[Image: Deepfake audio LastPass CEO impersonation]

"As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally."

Kosak added that the attack failed and had no impact on LastPass. Even so, the company chose to share details of the incident to warn other organizations that AI-generated deepfakes are already being used in executive impersonation fraud campaigns.

The deepfake audio used in this attack was likely generated by models trained on publicly available recordings of LastPass' CEO, possibly this one available on YouTube.
