Most crimes, at the end of the day, are crimes of opportunity. After all, bad actors and scammers just need to get lucky once. And with the resurgence of the cryptocurrency market, as bitcoin tops $70,000 and even meme coins start to make a comeback, opportunity is everywhere.
This, as a new crime report from the FBI’s Internet Crime Complaint Center (IC3) shows that Americans filed more than 43,000 complaints about cryptocurrency scams last year. Those complaints were far from unfounded: losses to crypto-based frauds and scams rose to $3.9 billion, up 53% year over year.
“These scams are designed to entice those targeted with the promise of lucrative returns on their investments,” the FBI noted.
Scam factories are a driving force behind the rise in crypto scams: criminals traffic tens of thousands of individuals, confine them to compounds and force them to conduct online scams targeting unsuspecting foreign nationals. One popular tactic is “pig butchering,” where criminals use fictitious identities to develop relationships and build rapport with victims on dating apps, social media platforms, professional networking sites, or encrypted messaging apps.
The schemes are socially engineered to build trust, usually beginning with a romance or confidence scam and evolving into cryptocurrency investment fraud — when the “pig,” after being fattened up, gets “butchered.”
Per the FBI’s report, business email compromise attacks were the second-costliest cybercrime tactic, netting criminals $2.9 billion from American victims.
Read more: Scam Factories Exploit Advanced Tech to Amplify Payments Fraud
Scammers Scale Their Attacks in Search of Easy Targets
The staggering global scale of these contemporary cryptocurrency scams underscores an urgent need for greater awareness and vigilance among the general public. As the popularity of cryptocurrencies continues to grow, individuals should remain skeptical whenever an unknown number or individual attempts to contact them.
That’s because, once contact has been established, bad actors can choose from numerous tactics with which to entangle their victims.
As another press release from the FBI illustrates, criminals are even creating fake gaming apps to steal millions of dollars in cryptocurrency, advertising the apps as play-to-earn games that offer financial incentives to players.
In this particular scam, bad actors — having established a connection with their victim — direct the target to create a cryptocurrency wallet, purchase cryptocurrency, and join a specific game app. Victims are told that the more money they hold in their wallet, the more rewards they will earn, and as they play the game, they see fake rewards accumulating in the app.
In reality, the criminals drain the wallets using a malicious program that victims unknowingly activated when they joined the game — and while victims are often told they can reclaim their money by paying an additional fee, that is just another iteration of the same scam.
Tools like artificial intelligence (AI) are helping bad actors to systematically industrialize their attacks, expanding their reach across the world via digital platforms.
See also: Crypto Continues to Serve as Case Study in Behavioral Economics
If It’s Too Good To Be True, It Probably Is
The crypto marketplace is already notorious for Ponzi-like schemes and other scams — and bad actors are increasingly taking advantage of victims who have lost cryptocurrency to fraud, scams and theft by pitching recovery schemes that are themselves fraudulent.
Per the FBI, representatives of fraudulent businesses claiming to provide cryptocurrency tracing and promising an ability to recover lost funds are turning to social media and other messaging platforms to contact victims directly with offers to recover their lost assets.
These recovery scheme fraudsters charge an up-front fee and either cease communication with the victim after receiving an initial deposit, or produce an incomplete or inaccurate tracing report and request additional fees to recover funds. Fraudsters may claim affiliation with law enforcement or legal services to appear legitimate.
In a recent interview with PYMNTS, Kate Frankish, chief business development officer and anti-fraud lead at Pay.UK, pointed to how digital technologies such as AI deepfake images are enabling fraudsters to mimic individuals with exceptional precision, making it difficult for even tech-savvy individuals to tell what’s real.