AI Scam Factory: How Deepfakes and Smart Fraud Stole $4.6 Billion in Crypto
Cryptocurrency scam losses reached $4.6 billion worldwide in 2024, surpassing forecasts. The data comes from the 2025 Anti-Scam Research Report, produced jointly by Bitget - a digital currency platform and Web3 company - and the cybersecurity firms SlowMist and Elliptic.
The Report's Findings
Released in June 2025 to coincide with Bitget's Anti-Scam Month, the document draws on on-chain tracing, real-world examples, and behavioral patterns to sketch today's scam scene.
At the center of the findings is how scams have shifted. AI-generated scam emails used to be obvious, riddled with mistakes. Today's tricks include video calls with synthetic people, doctored clips of famous figures speaking words they never said, and harmful software hidden inside job offers that seem genuine - each move meant to build trust or break into devices. Some schemes look so convincing that even experienced crypto users get caught off guard.
One reason people lose money is deepfake scams that mimic real individuals. These fake identities trick users by appearing trustworthy. Another path involves manipulation through personal connections, where scammers build false rapport. Instead of telling clear lies, they blur lines using emotional cues. A third type is Ponzi-style projects hiding behind names that sound like official crypto platforms or digital art drops. Though each method targets a different layer of trust - identity, relationship, and investment credibility - they all increasingly resemble ordinary transactions. Spotting fakes now demands a sharp eye.
How Stolen Crypto Disappears
One of the most technically valuable sections of the report - contributed by Elliptic - dives into the mechanics of moving stolen cryptocurrency. After scammers pull off a theft, money often flows through cross-chain bridges first. From there, it bounces through obfuscation tools. Only later does it land in mixers or exchange platforms - and once it hits those points, getting it back is no longer realistic.
Cross-chain bridges let people transfer assets between different blockchain networks. These tools started out as legitimate infrastructure that helps networks work better together. Later, though, shady players found a loophole - shifting money across chains splits up the audit trail. Once funds touch down on a new chain, their history is effectively wiped clean, making blockchain tracking significantly harder.
Mixers - also called tumblers - scramble transaction trails by combining digital funds from many people before sending them onward, making origins hard to trace. Even if some use these tools just to shield their privacy, criminals often rely on them to clean tainted cryptocurrency. Authorities across different regions now block certain mixer platforms, yet every time one gets shut down, others pop up in its place.
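The pooling idea behind a tumbler can be sketched in a few lines of Python. This is a toy illustration only - the participant names, the mix helper, and the fixed one-coin deposits are invented for the example; real mixers add time delays, fees, and far larger anonymity sets:

```python
import random

def mix(deposits):
    """Toy tumbler sketch: pool equal-sized deposits, then pay out to fresh
    addresses in shuffled order, so an on-chain observer cannot map any
    input to any output. (Illustrative only - not a real mixer design.)"""
    senders = list(deposits)
    random.shuffle(senders)  # decouple payout order from deposit order
    return {f"fresh_addr_{i}": deposits[s] for i, s in enumerate(senders)}

payouts = mix({"alice": 1.0, "bob": 1.0, "carol": 1.0})
print(sum(payouts.values()))  # 3.0 - the total is conserved, but origins are gone
```

Because every payout is the same size and lands at a fresh address, the link between who paid in and who was paid out exists only inside the mixer's own records, if at all.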
"Criminals are constantly evolving their methods of attack, using AI and finding new ways to scale their activities," noted Arda Akartuna, lead crypto threat researcher at Elliptic APAC. "This means that reciprocally, we are also working to scale our technology and blockchain capabilities to track and identify the new methods criminals are using."
One thing SlowMist did was break down how certain scams actually work. Not just summaries - detailed looks at sneaky methods. Take address poisoning, for instance. Scammers craft a wallet address that looks almost identical to one you've used before. Their hope? You'll copy the wrong one from your transaction history later without noticing. Another method they examined is fake job offers hiding malicious software. These traps often target developers and remote workers, especially those involved in decentralized platforms.
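Why address poisoning works becomes clearer with a short sketch. Wallet UIs often display only the first and last few characters of an address, so a poisoned address that matches at both ends looks identical in the truncated view. The addresses and the looks_similar helper below are hypothetical, made up purely for illustration:

```python
def looks_similar(addr_a, addr_b, prefix=6, suffix=6):
    """Return True if two distinct addresses share the same visible prefix
    and suffix - exactly the match a truncated wallet display cannot detect."""
    return (addr_a != addr_b
            and addr_a[:prefix] == addr_b[:prefix]
            and addr_a[-suffix:] == addr_b[-suffix:])

# Hypothetical example addresses: same first and last characters, different middle.
legit    = "0x4f2a81c7e5d903b6aa10fe7742bb35c0d8e49c1d"
poisoned = "0x4f2ad11aa92f4b7cc385e90d1c6b2a7f3ee49c1d"

print(looks_similar(legit, poisoned))  # True - a truncated UI shows both as 0x4f2a...e49c1d
```

The defense is equally simple: compare the full address, or use your wallet's address book, rather than copying from transaction history.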
Deepfakes, Social Engineering, and the Weaponization of Identity
To understand why 2024 represented such an escalation, it helps to look at what deepfake technology has become capable of - and how cheaply that capability is now available.
Fake videos or audio made by artificial intelligence can look and sound just like someone real. Once possible only with big budgets or specialist tools, this technology now sits on everyday phones. Thanks to open-source software and widely available apps, making such fakes takes little skill today. Videos of company executives surface saying things they never said. Investors' voices get cloned in phone calls about money matters. Even clips styled like TV news broadcasts help lies spread inside cryptocurrency circles.
A closer look at Hong Kong reveals cases that stand out, showing how deepfake scams have triggered strong responses from police. Some victims found themselves pulled into lengthy relationships with made-up identities, losing money only after weeks or even months of trust-building. This matches a tactic experts call pig butchering: a slow-building deception in which trust grows bit by bit until the scam is sprung.
Fake DeFi, Fake NFTs, and Hype Cycle Exploitation
Ponzi-like schemes hiding behind DeFi or NFT labels make up the report’s third big group. While these scams existed long before artificial intelligence took off, today’s tech noise gives them more cover to spread.
Decentralized finance (DeFi) refers to financial services and products built on public blockchains, designed to operate without traditional intermediaries like banks. Legitimate protocols let people lend, swap, or earn yield freely. Yet because of that open access, scammers can always deploy a fraudulent protocol that seems legitimate, with a professional-looking interface, fabricated audit reports, and a social media presence. Then they do what the industry calls a rug pull - attracting user deposits before abruptly withdrawing all liquidity and abandoning the project.
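The rug-pull mechanics can be illustrated with a toy constant-product pool, the pricing model behind many DeFi exchanges. Everything here - the ToyPool class, the reserve numbers - is a simplified assumption for demonstration, not a model of any real protocol:

```python
class ToyPool:
    """Toy constant-product market (x * y = k) used to illustrate a rug pull.
    Purely illustrative; real pools involve LP tokens, fees, and contracts."""

    def __init__(self, token_reserve, eth_reserve):
        self.token = token_reserve
        self.eth = eth_reserve

    def buy_token(self, eth_in):
        """A buyer adds ETH; tokens are paid out so that x * y stays constant."""
        k = self.token * self.eth
        self.eth += eth_in
        tokens_out = self.token - k / self.eth
        self.token -= tokens_out
        return tokens_out

    def rug_pull(self):
        """The deployer drains the pool, leaving token holders unable to sell."""
        drained = self.eth
        self.token = self.eth = 0
        return drained

pool = ToyPool(1_000_000.0, 100.0)
pool.buy_token(10.0)       # victims buy in, pushing the ETH reserve to 110
stolen = pool.rug_pull()   # deployer exits with the entire ETH reserve
print(stolen)              # 110.0
```

The key point the sketch makes is that the deployer's exit is a single, legitimate-looking withdrawal: nothing in the pool's math distinguishes a rug pull from an ordinary liquidity removal until the tokens become unsellable.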
NFTs remain a favorite tool for scam artists, just as they were during the 2021–2022 boom. Even though interest has faded lately, the report shows fake NFT schemes still pose real risks - especially as the technology finds fresh uses in gaming, intellectual property, and digital identity. While honest projects promise community belonging and possible gains, scammers keep finding ways to ride that same wave.
The Broader Impact on the Crypto Ecosystem
Fraud at this scale slows momentum toward widespread adoption, shakes the confidence of oversight bodies, and makes established players hesitate before stepping in.
Confidence slips when big scams hit the news, shaking trust among everyday users. Legitimate projects find it tougher to bring people in once fear takes hold. Headlines about counterfeit videos and bogus DeFi apps keep piling up, coloring the whole field with suspicion. The old image of crypto as a lawless frontier sticks around, no matter how much effort goes into changing it.
What stands out for regulators is how clearly this report shows self-policing no longer cuts it. Tech-driven scams now move too fast and reach too far, leaving old-style oversight behind. In response to fraud trends like these, jurisdictions such as the EU under its MiCA rules, along with watchdogs across parts of Asia-Pacific, have begun strengthening Know Your Customer and Anti-Money Laundering checks. Behind the scenes, organized fraud networks operate across borders, something else the document highlights. Because these illegal operations stretch through countries with mismatched laws, pursuing them becomes harder, tangled between different legal systems.
Meanwhile, institutional players worry that scams will complicate their compliance obligations and damage their public image, which slows how fast they step in. New tools for holding digital assets safely must grow alongside policies protecting against losses.
Efforts Underway
Later parts of the report spotlight Bitget’s built-in safeguards. Its Anti-Scam Hub works alongside live monitoring tools to guard users. A safety net worth more than half a billion dollars backs these efforts. That fund kicks in if major system breaches occur. Such reserves have since popped up across other trading sites. Pressure has been rising for platforms to take monetary responsibility when user holdings are compromised.
Not just one voice but three - an exchange, a blockchain analysis team, a cyber-watch group - shaped the same document. That mix hints at change: rivals closing ranks when threats emerge. The framing aligns with how cybersecurity has evolved in other sectors, where sharing active threat intelligence between competitors has become standard practice.