With cybercriminals becoming more adept in their methods, it is crucial for people to stay aware of the latest scams. One newer tactic is the audio deepfake, in which AI is used to clone a person's voice (and, in video deepfakes, their face) to make scams appear far more convincing. Irene Corpuz, a cybersecurity expert, cited a case in which the Hong Kong office of a British engineering company lost around HK$200 million to criminals using AI-generated video calls, underscoring the severity of the threat and the need for individual caution.
To avoid falling victim to audio deepfake scams, Corpuz advised against answering yes-or-no questions from unknown callers: scammers can splice a recorded "yes" to authorize fraudulent transactions or to fool automated systems that rely on voice recognition for identity verification. Scammers also impersonate banks and other reputable organizations to make their calls seem legitimate. Remaining vigilant and refusing to divulge personal information to unknown callers is essential to preventing such scams.
JD Ackley, CEO at Raizor, emphasized the importance of treating unsolicited calls with suspicion and paying close attention to what the caller asks for. Scammers often demand payment in unusual forms, such as gift cards or money transfers, which legitimate businesses would not request. Ackley also advised asking for a call-back number and verifying the legitimacy of the call before providing any sensitive information. Recognizing these red flags helps individuals avoid falling victim to audio deepfake scams.
Barney Almazar, a legal expert, noted that scammers deliberately target people when their guard is down, making fraudulent intentions easier to conceal. During periods when bank hotlines are hard to reach, scammers exploit the gap to pressure victims. Almazar stressed the importance of education and awareness in combating audio deepfakes and other scams. Under the UAE Cybercrime Law, stringent measures criminalize electronic fraud and impersonation, with severe penalties for offenders.
In conclusion, staying informed about the latest scams and remaining vigilant with unknown callers are crucial to protecting oneself from audio deepfake scams. By declining to answer yes-or-no questions, treating unusual payment requests with suspicion, and verifying the legitimacy of calls, individuals can avoid becoming victims of cybercriminals. Education, awareness, and critical thinking all play key roles in blunting the impact of audio deepfake scams. Verify information before acting on it, and report any suspicious activity to the relevant authorities.