Deepfake crypto scams are projected to surpass $25 billion in losses in 2024, doubling the previous year's total, according to a report by Bitget. Citing research data from Sumsub, the report documents a 245% increase in deepfake incidents worldwide, with China, Germany, Ukraine, the United States, Vietnam, and the United Kingdom among the most affected countries amid a surge of cases in the first quarter of 2024. In that quarter, the crypto industry saw losses rise 217% compared to the same period in 2023, totaling over $6.3 billion. Bitget predicts that losses could reach $10 billion per quarter by 2025 and that deepfakes may account for over 70% of all crypto crimes within the next two years.
Deepfake fraudsters primarily target influential figures and celebrities to promote fraudulent investment schemes. By exploiting the likeness of these individuals, they create the false impression that a project has legitimate backing, making it more convincing to potential victims. For example, scammers recently used deepfake technology to impersonate Elon Musk in YouTube live streams, instructing viewers to send cryptocurrency to a specified address with the promise of receiving double the amount in return. There have also been instances of deepfakes impersonating high-level executives in online meetings to authorize large transactions, a threat to both the corporate and crypto sectors.
While deepfakes currently dominate the realm of AI-driven crypto crime, other applications are emerging. A recent report by Elliptic highlights the rise of AI-enabled crypto crimes, including deepfake scams, state-sponsored attacks, and other sophisticated illicit activities. The report warns that dark web forums discuss using large language models (LLMs) for crypto-related crimes such as attempting to reverse-engineer wallet seed phrases and automating scams like phishing and malware deployment. Dark web markets also offer unethical GPT (Generative Pre-trained Transformer) variants built for crypto crime and designed to bypass the safeguards of legitimate models.
In light of these developments, Elliptic calls for early detection and monitoring of illicit activity so that innovation can continue over the long term while the emerging risks of AI-driven crypto crime are mitigated.