In the ever-evolving landscape of fraud, Canadian businesses are facing a new and formidable threat: artificial intelligence (AI)-generated “deepfakes.”
A recent survey conducted by KPMG in Canada shows that more than nine in 10 organizations victimized by fraud are concerned about the risks posed by these sophisticated scams.
The research, which surveyed 300 Canadian organizations affected by fraud, revealed growing concern among business leaders.
A staggering 95% expressed grave concerns regarding the increasing prevalence of AI-generated deepfakes and their potential to heighten the risk of fraud within their companies.
Additionally, 91% fear generative AI could empower criminals to orchestrate corporate misinformation and disinformation campaigns using deepfakes.
The survey findings underscore the pervasive impact of fraud on Canadian businesses.
Enzo Carlucci, national forensic leader at KPMG in Canada, emphasized the multifaceted nature of modern fraud schemes.
“Respondents overwhelmingly told us the fraud landscape is becoming more complex, with 95% saying generative AI and social engineering scams make it easier for fraudsters to deceive, manipulate, misrepresent, and conceal their crime,” said Carlucci.
Organizations are increasingly turning to technological solutions to combat the escalating threat. Nearly half of the surveyed companies actively employ emerging technologies such as AI and advanced data analytics to mitigate fraud risks.
Marilyn Abate, a partner in KPMG’s Forensic and Financial Crimes practice, stressed the importance of leveraging technology in the fight against fraud.
“Companies need to use AI to fight AI,” Abate said. “These tools are fast-becoming essentials in the fraud toolkit to prevent fraudsters from gaining the upper hand. But if you don’t perform regular fraud risk assessments to identify external and internal risks and vulnerabilities, you will always be at a disadvantage.”