In this scam operation, AI handles the dating conversations and even forges fake lawyer credentials.


Author: Curry, Deep Tide TechFlow

OpenAI recently released a report on malicious uses of ChatGPT that it has detected and disrupted.

The report is lengthy and lists numerous cases of AI abuse. Some involve Russian disinformation campaigns and suspected social-engineering espionage, but today I want to discuss one particular case:

Cambodian “Pig Butchering” scams.

Pig butchering scams are not new; many of you have heard countless stories about the Cambodian scam parks. What’s unusual here is the role AI played.

Within this scam operation, ChatGPT handled dating conversations, translated supervisor instructions, wrote daily work reports, and estimated the value of each victim.

In pig butchering scams, there’s an internal term called “kill value,” which is the estimated amount of money that can be extracted from a victim.

Across the entire pipeline, ChatGPT might be the busiest employee.

OpenAI gave this case a codename: Operation Date Bait.

The process is as follows:

The scam gang first created a fake high-end dating service called Klub Romantis, with a logo generated by ChatGPT. They then ran paid ads on social media targeting keywords like golf, yachts, and fine dining, specifically aimed at young Indonesian men.

When you click the ad, you first chat with an AI chatbot. The bot, posing as a sexy receptionist, asks what type of girl you like. After you choose, it provides a Telegram link with a special invite code.

Once on Telegram, a real person takes over.

The handler continues to use ChatGPT to generate flirtatious messages, becoming more and more explicit, then guides you to two fake dating platforms, called LoveCode and SexAction.

On these platforms, there are fake profiles of women and a scrolling message bar constantly broadcasting “Congratulations to so-and-so for completing a task, unlocking a bonus.” All of this is fabricated; experienced internet users might see through it immediately, but not all targets are so discerning.

Once the conversation reaches a certain point, the handler transfers you to a “mentor.” The mentor then assigns you “tasks,” each requiring payment, with amounts increasing each time—buying VIP cards, voting for “favorite girls,” paying hotel deposits, and so on.

The final step, internally called “kill,” involves fabricating a reason—such as data processing errors or deposit verification—to get you to transfer a large sum at once. OpenAI included a letter from the scam gang to victims in the report, demanding 20.5 million Indonesian rupiah, roughly $1,200 USD, promising a 35% bonus if paid.

Once the money is received, the scammer on Telegram will block you and mark the case as closed.

At this point, you might think it’s nothing new.

The scam techniques themselves aren’t innovative; pig butchering scams have been exposed many times over the past few years. What’s truly startling is the backend.

OpenAI investigators pieced together a complete corporate structure from the usage logs of these ChatGPT accounts:

The scam operation is divided into three departments: Lead Generation, Customer Service, and Management. The Lead Generation team runs ads to attract targets; Customer Service handles conversations to build trust; Management oversees the final extraction.

They produce daily reports listing each active victim, the responsible person, the current stage, and a number called “kill value”—the estimated final amount that can be extracted from that individual.

They also use ChatGPT to analyze financial accounts, generate work reports, and even ask ChatGPT how to connect APIs or modify dating website code. When managers speak Chinese and employees speak Indonesian, ChatGPT handles the translation.

Amusingly, one scam worker openly asked ChatGPT about tax issues after earning money, honestly listing "scammer" as their occupation.

OpenAI’s report is quite restrained, stating that based on the scam gang’s own input records, they might be handling hundreds of targets simultaneously, earning thousands of dollars daily. However, the report also notes that these figures cannot be independently verified.

But I believe that even without verifying the numbers, just looking at this management process is revealing:

Lead generation, conversion, customer value, daily reports, departmental roles—change the terminology, and it looks like an operational manual for a SaaS company.

And the activities—dating, translation, daily reporting, coding, accounting—most of the work in this operation is done by a single ChatGPT account.

The story doesn’t end there.

OpenAI’s report also details a second operation, codenamed Operation False Witness, also originating from Cambodia.

This operation targets people who have already been scammed once.

The logic is simple: if you’ve been defrauded by a pig butchering scam and want to recover your losses, you search online for solutions.

Then you see an ad for a law firm claiming to help scam victims recover their money. You click.

The website looks very legitimate. Some lawyer photos are stolen from social media, others are AI-generated. Each law firm has an address, a license, and a profile. ChatGPT even generated a fake membership card for the New York State Bar Association and fabricated lawyer registration records.

OpenAI identified at least six fake law firms.

There is also a website that directly impersonates the FBI's Internet Crime Complaint Center (IC3). It has a "Submit Complaint" button that redirects to a Telegram account.

On Telegram, the "lawyer" begins chatting with you. The language is generated by ChatGPT, deliberately crafted in polished American English with a professional tone. They tell you they cooperate with the International Criminal Court and that the recovery service charges nothing until your money is recovered.

But you must pay a 15% service fee upfront via cryptocurrency to activate your account.

They also ask you to sign a confidentiality agreement, also written by ChatGPT, designed to prevent you from seeking outside verification.

The FBI later issued a public warning about this scam, noting it mainly targets elderly victims, exploiting their urgent desire to recover losses.

Looking at these two cases together, the most ironic part of this AI-enabled scam economy is this:

The first time they scam you, you are just a target. The second time, you are a better target because you’ve already proven you can be fooled.

Finally, OpenAI summarized the scam process into three steps in the report:

First is “ping”—a cold outreach to get the target’s attention.
Second is “zing”—stirring emotions to make you excited, anxious, or fearful.
Third is “sting”—the final extraction, taking your money.

The framework is a neat summary. Look closely: at which of these steps can AI not be involved?

In traditional pig butchering scams, the biggest cost was human labor. You had to hire a bunch of people to chat at computers, speaking the target’s language. Early on, Cambodian scam parks even recruited English speakers with high wages.

Now, according to the report, the managers speak Chinese, the workers speak Indonesian, and the targets are Indonesians. The three parties share no common language, an arrangement that would be unworkable without AI. Add ChatGPT, and everything runs seamlessly.

Language is just one aspect.

The report also mentions that scam workers even asked ChatGPT how to connect to OpenAI’s API, aiming to fully automate the chat process.

In other words, AI isn’t making the scams more sophisticated; it’s making them cheaper.

Today, according to OpenAI, this gang might be handling hundreds of scams simultaneously. The scale increases, and the cost per victim decreases, allowing them to target more people with smaller amounts.

Another question I think is worth pondering:

OpenAI can detect these scams because the scam gang uses ChatGPT, and their chat logs are stored on OpenAI’s servers.

But what about those using locally deployed open-source models?

What this report shows is only a small piece of the puzzle that OpenAI can see. The larger part—how much is hidden, and what’s happening there—no one knows.
