The "Pig Butchering" scam, or Sha Zhu Pan, isn't new, but the tools used to execute it are getting terrifyingly advanced.
Once a labor-intensive operation built on stolen photos and generic scripts, this billion-dollar fraud—where victims are groomed for months before being tricked into investing cryptocurrency on a fake platform—is now being optimized by cutting-edge, black-market AI.
Forget the days of scammers failing video checks. We’re now dealing with custom deepfake software sold for serious money, designed to bypass every common security test. If you or a loved one use dating apps or social media, or invest in crypto, this is mandatory reading.
The New Arsenal: “God-Level Assistance” for Fraud
Criminal organizations, often operating out of Southeast Asian scam compounds and trading on sprawling marketplaces like Huione Guarantee, are upgrading their toolkits with proprietary AI solutions. They’ve found a critical vulnerability in the scam lifecycle—the inability to convincingly appear on a live video call—and they’re plugging it with “Deepfake-as-a-Service.”
Enter Haotian AI. This deepfake software is advertised on Telegram channels catering specifically to scam bosses, promising to solve their biggest operational problem. Where a convincing deepfake once required days of training and specialized skill, Haotian AI claims to offer “one-button operation,” putting highly realistic, real-time face-swapping in the hands of anyone willing to pay. The software doesn’t come cheap, with prices ranging from $1,200 to $9,900 for “God-Level Assistance.” The fact that scammers will pay those prices is itself evidence of how massive the return on a single successful “slaughter” can be.
The advertising copy reveals the sinister purpose: “The chat lacks authenticity? No Trust?... After all, how could such a beautiful girl lie, right?” The AI is explicitly marketed as a tool to manufacture emotional trust, playing on human psychology to lower the victim’s guard before the crypto pitch begins.
Bypassing the Video Test
You may have heard police advise victims to ask a suspicious video caller to wave their hand in front of their face, as early deepfakes would glitch. That advice is now dangerously obsolete. Haotian AI specifically advertises that its deepfake will “not glitch” when asked to cover or pinch the face, creating a seamless, realistic interaction designed to eliminate victim suspicion. If the persona is believable, the scammer can proceed to the financial phase. The cryptocurrency aspect remains essential: the transactions are irreversible, and the funds are quickly moved through complex chains of wallet transfers on the blockchain.
LLMs and the Crypto Buzzword Blur
The emotional grooming phase is also now a scalable operation thanks to large language models (LLMs). These tools allow a single scammer to manage dozens of emotionally resonant conversations simultaneously, sustaining the illusion of a genuine relationship over months. The AI isn’t perfect, though: some scammers have been caught pasting raw outputs that begin with bureaucratic language like, “As a large language model, I cannot...”—a crucial slip-up that exposes the automation behind the affection.
Once the emotional groundwork is complete, the scam pivots to investment. Fraudulent crypto platforms ride the hype surrounding AI, stuffing buzzwords like “AI,” “arbitrage,” “web3,” and “quantum” into their website URLs and marketing scripts. They promise an “AI intelligent trading system” that automatically generates high returns, even when the underlying website is a generic, reused template with no real connection to AI technology.
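For readers who like to tinker, both of those tells are easy to spot programmatically. The short Python sketch below is a minimal illustration; the refusal phrases and buzzword list are my own assumptions rather than a vetted anti-fraud ruleset, but it shows how a pasted LLM refusal or a buzzword-stuffed platform URL could be flagged automatically.

```python
import re

# Naive illustration only: these phrase and buzzword lists are assumptions
# for demonstration, not a real detection ruleset.
LLM_SLIP_PATTERNS = [
    r"as a (large )?language model",
    r"as an ai (assistant|model)",
    r"i cannot (assist|help) with",
]

HYPE_BUZZWORDS = ["ai", "arbitrage", "web3", "quantum"]


def looks_like_llm_paste(message: str) -> bool:
    """Flag a chat message that opens with pasted LLM refusal boilerplate."""
    head = message.strip().lower()[:120]
    return any(re.search(pattern, head) for pattern in LLM_SLIP_PATTERNS)


def hype_words_in_url(url: str) -> list[str]:
    """Return the hype buzzwords found in a platform URL (crude substring check)."""
    lowered = url.lower()
    return [word for word in HYPE_BUZZWORDS if word in lowered]


if __name__ == "__main__":
    print(looks_like_llm_paste("As a large language model, I cannot..."))     # True
    print(hype_words_in_url("https://quantum-ai-arbitrage-trading.example"))  # ['ai', 'arbitrage', 'quantum']
```

A substring check this crude will produce false positives (any URL containing “ai” trips it), but that is rather the point: even a toy filter catches tells that victims in an emotionally charged conversation routinely miss.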
Ultimate Red Flags: The Defense Is Your Skepticism
As the technology gets better, your defense needs to shift from technical checks to behavioral and platform analysis.
Prevention is the only cure.
First, be extremely suspicious of any new online acquaintance who quickly professes strong feelings, volunteers intimate details of their own financial success, and steers the conversation toward investing. That remains the oldest and most reliable red flag.
When dealing with video, since the “wave test” is dead, look for subtler tells: stiff or unnatural head movements, lighting on the face that doesn’t match the background, skin that looks too smooth, or an overall quality that simply feels uncanny. Trust your gut.
Finally, never fall for the financial manipulation. If a stranger asks you to move funds into cryptocurrency and transfer it to a proprietary, unknown platform—especially one promising “AI-driven arbitrage”—it’s a scam. If you try to withdraw your profits and are told you need to pay a “tax,” a “security deposit,” or a “transfer fee” first, you have been scammed.
AI is transforming this crime from a scam of human effort into a scam of technological scale. Stay vigilant, educate your family, and remember the golden rule:
If a stranger you met online offers you guaranteed, easy profits, the only guarantee is that they are the butcher, and you are the pig.
Yes, it sounds horrible.
But it’s true.