FraudGPT and the rise of new AI-powered cybercrime tools

August 8, 2023

The creator of a new AI application called FraudGPT claims the tool can help you scam people out of their money. How effective it is at doing that is questionable, but it may just be the tip of the dark AI iceberg.

If you ask ChatGPT to write an email purporting to be from a bank for use in a phishing scam, it will politely decline. FraudGPT, ChatGPT’s evil twin, has no problem helping you do that.

Mainstream AI models like ChatGPT and Llama have guardrails in place to stop them from being used for malicious ends.

The creator of FraudGPT says his tool has no compunctions when it comes to crime and will happily write hacking code or scammer emails. And he’ll gladly sell the tool to you on a subscription basis.

WormGPT vs FraudGPT

In early July, online security fears were raised with the release of WormGPT. It was based on an LLM called GPT-J, but with the guardrails stripped out. The model was nowhere near as adept as GPT-3 or GPT-4, so the naughty emails and code it wrote ended up being fairly basic stuff.

Hot on its heels came FraudGPT, released about two weeks ago. The creator of FraudGPT, who goes by the handle CanadianKingpin12, claims the new tool is based on GPT-3 and can “create undetectable malware.”

If you’re that way inclined, you’d have to contact him (or her?) on the dark web and pay $200 to access FraudGPT. CanadianKingpin12 claims, and screenshots seem to confirm, that he’s already made over 3,000 sales.

Some videos on YouTube give you an idea of what FraudGPT can do.

So malicious intent aside, the video demonstration is a little underwhelming. Maybe he left the best parts out. Or maybe something else is going on here.

Is FraudGPT scamming the scammers?

We’ve not played around with FraudGPT, but some cybersecurity professionals who have checked it out aren’t losing any sleep over it.

When asked to write a spam text message to defraud Bank of America clients, FraudGPT gave this output: “Dear Bank of America Member: Please check out this important link in order to ensure the security of your online bank account.”

Yeah, not exactly Bond villain-level stuff.

Melissa Bischoping, a researcher at cybersecurity firm Tanium, said, “I have seen nothing to suggest that this is scary.” 

Her opinion of the people behind FraudGPT is that they “are preying on people who are not sophisticated enough to actually write their own malware, but want to make a quick buck.”

That may be what’s happening here: just some wannabe hackers and scammers pretending they’re badass, using a naughty AI without getting much bang for their buck.

But it does point to two concerning developments. Firstly, people are building AI cybercrime tools like this. And secondly, there’s a ready market for them.

Enter DarkBART and DarkBERT

In true infomercial style, CanadianKingpin12 says, “But wait, there’s more!” He’s now ready to offer new cybercrime AI tools called DarkBART and DarkBERT.

DarkBART is apparently a dark version of Google’s Bard AI. We’re guessing it’ll be similar to FraudGPT but powered by Google’s LLM rather than GPT-3. Is it any good at being bad? We don’t know yet.

DarkBERT is a bit more interesting. It may not be a new tool that CanadianKingpin12 made at all; he may simply have got his hands on an LLM by that name created by legitimate researchers in South Korea.

That model was trained on a large amount of Dark Web data, much of it genuinely nasty stuff covering hacking, ransomware, and scams. It was designed to detect and handle cyber threats.

You can request access to the tool, but only if you have an email address from an academic institution. Not exactly a foolproof screening method.

Either CanadianKingpin12 got his hands on DarkBERT and tweaked it, or it’s something else entirely. Either way, he was happy to show it off briefly in this video.

Is he just having a laugh and enjoying scamming the scammers? We hope so. But the demand for, and development of, AI cybercrime tools are unquestionable. As countries move to build AI weapons, we can be sure that criminals are also adding AI to their arsenals.

There are probably some really good AI cybercrime tools already available on the Dark Web if you know which people to ask. CanadianKingpin12 just doesn’t seem to be one of those people.

Eugene van der Watt

Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.
