Hackers are using a new AI chatbot to wage attacks: GhostGPT lets users write malicious code, create malware, and craft phishing emails, and it costs just $50 to use
Hackers are using an uncensored chatbot dubbed GhostGPT to help write malware, highlighting how AI can be twisted toward illegal activities.
That’s according to Abnormal Security, which laid out details of GhostGPT in a blog post, saying the chatbot lacks the guardrails of standard AI tools such as ChatGPT, making it a helpful tool for cyber criminals.
It’s not the first hackbot-as-a-service, however. WormGPT arrived in 2023 offering a similar chatbot subscription service for writing phishing emails and business email compromise attacks.
That, Abnormal Security noted, was followed by WolfGPT and EscapeGPT, suggesting GhostGPT is a sign malicious actors see value in AI helping them commit cyber crime.
The security company explained that GhostGPT was specifically designed for cyber crime purposes and that enterprises should be wary of its potential looking ahead.
“It likely uses a wrapper to connect to a jailbroken version of ChatGPT or an open source large language model (LLM), effectively removing any ethical safeguards,” the blog post explained.
“By eliminating the ethical and safety restrictions typically built into AI models, GhostGPT can provide direct, unfiltered answers to sensitive or harmful queries that would be blocked or flagged by traditional AI systems.”
What can GhostGPT do?
Abnormal shared a screenshot of an advertisement for the GhostGPT service that claimed the chatbot was fast and easy to use, offered uncensored responses, and had a strict no-logs policy, saying “protecting our users’ privacy is our top priority.”
Abnormal noted that GhostGPT was marketed for coding, malware creation, and exploit development, but could also be used to write material for business email compromise (BEC) scams. The advertisement noted its various features make GhostGPT “a valuable tool for cybersecurity and various other applications.”
“While its promotional materials mention ‘cybersecurity’ as a possible use, this claim is hard to believe, given its availability on cyber crime forums and its focus on BEC scams,” the Abnormal post added.
“Such disclaimers seem like a weak attempt to dodge legal accountability — nothing new in the cybercrime world.”
Indeed, when Abnormal’s researchers asked GhostGPT to write a phishing email, it produced a template that could be used to trick victims.
Easy access for all hackers
GhostGPT is accessible as a Telegram bot, making it easy for attackers to use without technical skills or the time needed to set up their own systems, Abnormal noted.
“Because it’s available as a Telegram bot, there is no need to jailbreak ChatGPT or set up an open source model,” the blog post noted. “Users can pay a fee, gain immediate access, and focus directly on executing their attacks.”
A report in Dark Reading noted that prices for GhostGPT were relatively cheap, too: $50 for a week, $150 for a month, and $300 for three months.
Fresh challenge for security
By lowering the barrier to entry for would-be hackers, such chatbots make it easier for cyber criminals without extensive skills to attack anyone, posing a real challenge for personal and organizational security.
Chatbots also make it faster and easier to launch cyber crime campaigns by enabling threat actors to create more effective malware, realistic-looking phishing emails, and so on.
“With its ability to deliver insights without limitations, GhostGPT serves as a powerful tool for those seeking to exploit AI for malicious purposes,” Abnormal said.
As cyber criminals shift to AI, so too must security professionals, Abnormal says, because tools like GhostGPT will make it easier to slip phishing emails and malware past traditional filters.