New AI Chatbot GhostGPT Makes Phishing and Malware Creation Alarmingly Easy
A new AI chatbot, GhostGPT, has emerged in the cybercrime underworld. Marketed as a tool for writing malware and phishing emails, it is drawing concern from security researchers. Rather than being a model of its own, GhostGPT appears to be a wrapper that connects to a jailbroken version of ChatGPT or to an open-source large language model stripped of its safety guardrails. It is the latest in a line of malicious AI chatbots that began with WormGPT, created in 2023 to support business email compromise (BEC) attacks.

GhostGPT's anonymous creator distributes it through Telegram, where its sale was first observed in late 2024. Since then it has gained significant traction, with promotional posts drawing thousands of views on cybercrime forums. Even low-skilled criminals can now produce convincing phishing emails with ease; in one recent test, the chatbot readily generated a polished DocuSign-themed lure.

Access is deliberately simple. The bot is sold directly on Telegram, sparing buyers the effort of jailbreaking ChatGPT or setting up an open-source model themselves. That low barrier to entry is what makes GhostGPT a significant threat: it puts sophisticated phishing campaigns within reach of unskilled attackers. Security experts urge vigilance and caution in the face of this new tool.