WormGPT is a malicious artificial intelligence tool based on GPT-J, an open-source large language model released by EleutherAI in 2021. Unlike mainstream AI chatbots such as ChatGPT, which enforce ethical guidelines and content moderation, WormGPT was intentionally designed without these safeguards, allowing users to generate content for illegal and unethical activities without restriction.
Key Features
• No ethical or content restrictions: WormGPT would respond to requests involving cybercrime, including phishing, malware creation, and business email compromise (BEC) attacks.
• Unlimited character support: Users could generate long-form content without length limits.
• Chat memory retention: The tool retained previous messages, enabling more coherent multi-turn conversations.
• Code formatting: WormGPT could generate and format code snippets, including malware and exploit scripts.
• Anonymity and privacy: As marketed on underground forums, WormGPT promised secure and confidential usage for cybercriminals.
• Multiple models: Users could select from several AI models for general or specialized use cases.
Use Cases
WormGPT was widely adopted in underground cybercrime communities for:
• Generating convincing phishing emails and social engineering content.
• Creating and formatting malicious code for malware or hacking tools.
• Assisting in BEC scams by crafting fraudulent messages to deceive victims.
Background and Demise
WormGPT was first introduced on hacker forums in 2021 and gained significant attention in 2023 for its capabilities and lack of restrictions. It was sold on a subscription basis, with prices reportedly ranging from €60 to €100 per month or €550 per year, and more expensive private setups were also offered. The tool's notoriety led to widespread media coverage, and its creator eventually ceased sales, attempting to distance themselves from its criminal misuse.