New AI Chatbot GhostGPT Threatens With Phishing Ease

GhostGPT is making phishing easy. Experts warn of increased threats as low-skilled hackers gain powerful tools.

A new AI chatbot, GhostGPT, has emerged in the cybercrime underworld. Designed to help write malware and phishing emails, it has alarmed security researchers. GhostGPT appears to be a wrapper that connects to a jailbroken version of ChatGPT or another open-source large language model. It is the latest in a line of malicious AI chatbots, following WormGPT, which appeared in 2023 and was built for business email compromise (BEC) attacks.

GhostGPT's creator, an anonymous hacker, distributes it on Telegram, where its sale was first observed at the end of 2024. Since then, the tool has gained significant traction, with thousands of views on online forums. Even cybercriminals with little technical skill can now produce convincing phishing emails with ease, such as a recently observed fake DocuSign template.

Accessing GhostGPT is simple: it is sold directly on Telegram, eliminating the need to jailbreak ChatGPT or set up an open-source model oneself.

GhostGPT's widespread availability and ease of use pose a significant threat, empowering low-skilled cybercriminals to launch sophisticated phishing campaigns. Security experts urge vigilance and caution in the face of this new tool.
